Online Safety Bill
Report stage (day 1)
Commons Chamber
Tue 12th Jul 2022
--- Later in debate ---
Mr Speaker (Lindsay Hoyle)

I welcome the new Minister to the Dispatch Box.

Damian Collins

Thank you, Mr Speaker. I am honoured to have been appointed the Minister responsible for the Online Safety Bill. Having worked on these issues for a number of years, I am well aware of the urgency and importance of this legislation, in particular to protect children and tackle criminal activity online—that is why we are discussing this legislation.

Relative to the point of order from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), I have the greatest respect for him and his standing in this House, but it feels like we have been discussing this Bill for at least five years. We have had a Green Paper and a White Paper. We had a pre-legislative scrutiny process, which I was honoured to be asked to chair. We have had reports from the Digital, Culture, Media and Sport Committee and from other Select Committees and all-party parliamentary groups of this House. This legislation does not want for scrutiny.

We have also had a highly collaborative and iterative process in the discussion of the Bill. We have had 66 Government acceptances of recommendations made by the Joint Committee on the draft Online Safety Bill. We have had Government amendments in Committee. We are discussing Government amendments today and we have Government commitments to table amendments in the House of Lords. The Bill has received a huge amount of consultation. It is highly important legislation, and the victims of online crime, online fraud, bullying and harassment want to see us get the Bill into the Lords and on the statute book as quickly as possible.

Sir Jeremy Wright (Kenilworth and Southam) (Con)

I warmly welcome my hon. Friend to his position. He will understand that those of us who have followed the Bill in some detail since its inception had some nervousness as to who might be standing at that Dispatch Box today, but we could not be more relieved that it is him. May I pick up on his point about the point of order from our right hon. Friend the Member for Haltemprice and Howden (Mr Davis)? Does he agree that an additional point to add to his list is that, unusually, this legislation has a remarkable amount of cross-party consensus behind its principles? That distinguishes it from some of the other legislation that perhaps we should not consider in these two weeks. I accept there is plenty of detail to be examined but, in principle, this Bill has a lot of support in this place.

Damian Collins

I completely agree with my right hon. and learned Friend. That is why the Bill passed Second Reading without a Division and the Joint Committee produced a unanimous report. I am happy for Members to cast me in the role of poacher turned gamekeeper on the Bill, but looking around the House, there are plenty of gamekeepers turned poachers here today who will ensure we have a lively debate.

Mr Speaker

And the other way, as well.

Damian Collins

Exactly. The concept at the heart of this legislation is simple. Tech companies, like those in every other sector, must take appropriate responsibility for the consequences of their business decisions. As they continue to offer their users the latest innovations that enrich our lives, they must consider safety as well as profit. They must treat their users fairly and ensure that the internet remains a place for robust debate. The Bill has benefited from input and scrutiny from right across the House. I pay tribute to my predecessor, my hon. Friend the Member for Croydon South (Chris Philp), who has worked tirelessly on the Bill, not least through 50 hours of Public Bill Committee, and the Bill is better for his input and work.

We have also listened to the work of other Members of the House, including my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the right hon. Member for Barking (Dame Margaret Hodge), my right hon. Friend the Member for Haltemprice and Howden and the Chair of the Select Committee, my hon. Friend the Member for Solihull (Julian Knight), who have all made important contributions to the discussion of the Bill.

We have also listened to those concerned about freedom of expression online. It is worth pausing on that, as there has been a lot of discussion about whether the Bill is censoring legal speech online and much understandable outrage from those who think it is. I asked the same questions when I chaired the Joint Committee on the Bill. This debate does not reflect the actual text of the Bill itself. The Bill does not require platforms to restrict legal speech—let us be absolutely clear about that. It does not give the Government, Ofcom or tech platforms the power to make something illegal online that is legal offline. In fact, if those concerned about the Bill studied it in detail, they would realise that the Bill protects freedom of speech. In particular, the Bill will temper the huge power over public discourse wielded by the big tech companies behind closed doors in California. They are unaccountable for the decisions they make on censoring free speech on a daily basis. Their decisions about what content is allowed will finally be subject to proper transparency requirements.

Dame Maria Miller (Basingstoke) (Con)

My hon. Friend did not have the joy of being on the Bill Committee, as I did with my hon. Friend the Member for Croydon South (Chris Philp), who was the Minister at that point. The point that my hon. Friend has just made about free speech is so important for women and girls who are not able to go online because of the violent abuse that they receive, and that has to be taken into account by those who seek to criticise the Bill. We have to make sure that people who currently feel silenced do not feel silenced in future and can participate online in the way that they should be able to do. My hon. Friend is making an excellent point and I welcome him to his position.

Damian Collins

My right hon. Friend is entirely right on that point. The structure of the Bill is very simple. There is a legal priority of harms, and things that are illegal offline will be regulated online at the level of the criminal threshold. There are protections for freedom of speech and there is proper transparency about harmful content, which I will come on to address.

Joanna Cherry (Edinburgh South West) (SNP)

Does the Minister agree that, in moderating content, category 1 service providers such as Twitter should be bound by the duties under our domestic law not to discriminate against anyone on the grounds of a protected characteristic? Will he take a look at the amendments I have brought forward today on that point, which I had the opportunity of discussing with his predecessor, who I think was sympathetic?

Damian Collins

The hon. and learned Lady makes a very important point. The legislation sets regulatory thresholds at the criminal law level based on existing offences in law. Many of the points she made are covered by existing public law offences, particularly in regards to discriminating against people based on their protected characteristics. As she well knows, the internet is a reserved matter, so the legal threshold is set at where UK law stands, but where law may differ in Scotland, the police authorities in Scotland can still take action against individuals in breach of the law.

Joanna Cherry

The difficulty is that Twitter claims it is not covered by the Equality Act 2010. I have seen legal correspondence to that effect. I am not talking about the criminal law here. I am talking about Twitter’s duty not to discriminate against women, for example, or those who hold gender critical beliefs in its moderation of content. That is the purpose of my amendment today—it would ensure that Twitter and other service providers providing a service in the United Kingdom abide by our domestic law. It is not really a reserved or devolved matter.

Damian Collins

The hon. and learned Lady is right. There are priority offences where the companies, regardless of their terms of service, have to meet their obligations. If something is illegal offline, it is illegal online as well. There are priority areas where the company must proactively look for that. There are also non-priority areas where the company should take action against anything that is an offence in law and meets the criminal threshold online. The job of the regulator is to hold them to account for that. They also have to be transparent in their terms of service as category 1 companies. If they have clear policies against discrimination, which they on the whole all do, they will have to set out what they would do, and the regulator can hold them to account to make sure they do what they say. The regulator cannot make them take down speech that is legal or below a criminal threshold, but they can hold them to account publicly for the decisions they make.

One of the most important aspects of this Bill with regard to the category 1 companies is transparency. At the moment, the platforms make decisions about curating their content—who to take down, who to suppress, who to leave up—but those are their decisions. There is no external scrutiny of what they do or even whether they do what they say they will do. As a point of basic consumer protection law, if companies say in their terms of service that they will do something, they should be held to account for it. What is put on the label also needs to be in the tin and that is what the Bill will do for the internet.

I now want to talk about journalism and the role of the news media in the online world, which is a very important part of this Bill. The Government are committed to defending the invaluable role of a free media. Online safety legislation must protect the vital role of the press in providing people with reliable and accurate sources of information. Companies must therefore put in place protections for journalistic content. User-to-user services will not have to apply their safety duties in part 3 of the Bill to news publishers’ content shared on their services. News publishers’ content on their own sites will also not be in scope of regulation.

--- Later in debate ---
Dame Margaret Hodge (Barking) (Lab)

I welcome the Minister to his position, and it is wonderful to have somebody else who—like the previous Minister, the hon. Member for Croydon South (Chris Philp)—knows what he is talking about. On this issue, which is pretty key, I think it would work if minimum standards were set on the risk assessments that platforms have to make to judge what is legal but harmful content, but at the moment such minimum standards are not in the Bill. Could the Minister comment on that? Otherwise, there is a danger that platforms will set a risk assessment that allows really vile harmful but legal content to carry on appearing on their platform.

Damian Collins

The right hon. Lady makes a very important point. There have to be minimum safety standards, and I think that was also reflected in the report of the Joint Committee, which I chaired. Those minimum legal standards are set where the criminal law is set for these priority legal offences. A company may have higher terms of service—it may operate at a higher level—in which case it will be judged on the operation of its terms of service. However, for priority illegal content, it cannot have a code of practice that is below the legal threshold, and it would be in breach of the provisions if it did. For priority illegal offences, the minimum threshold is set by the law.

Dame Margaret Hodge

I understand that in relation to illegal harmful content, but I am talking about legal but harmful content. I understand that the Joint Committee that the hon. Member chaired recommended that for legal but harmful content, there should be minimum standards against which the platforms would be judged. I may have missed it, but I cannot see that in the Bill.

Damian Collins

The Joint Committee’s recommendation was for a restructuring of the Bill, so that rather than having general duty of care responsibilities that were not defined, we defined those responsibilities based on existing areas of law. The core principle behind the Bill is to take things that are illegal offline, and to regulate such things online based on the legal threshold. That is what the Bill does.

In schedule 7, which did not exist in the draft phase, we have written into the Bill a long list of offences in law. I expect that, as this regime is created, the House will insert more regulations and laws into schedule 7 as priority offences in law. Even if an offence in law is not listed in the priority illegal harms schedule, it can still be a non-priority harm, meaning that even if a company does not have to look for evidence of that offence proactively, it still has to act if it is made aware of the offence. I think the law gives us a very wide range of offences, clearly defined against offences in law, where there are clearly understood legal thresholds.

The question is: what is to be done about other content that may be harmful but sits below the threshold? The Government have made it clear that we intend to bring forward amendments that set out clear priorities for companies on the reporting of such harmful content, where we expect the companies to set out what their policies are. That will include setting out clearly their policies on things such as online abuse and harassment, the circulation of real or manufactured intimate images, content promoting self-harm, content promoting eating disorders or legal suicide content—this is content relating to adults—so the companies will have to be transparent on that point.

Chris Philp (Croydon South) (Con)

I congratulate the Minister on his appointment, and I look forward to supporting him in his role as he previously supported me in mine. I think he made an important point a minute ago about content that is legal but considered to be harmful. It has been widely misreported in the press that this Bill censors or prohibits such content. As the Minister said a moment ago, it does no such thing. There is no requirement on platforms to censor or remove content that is legal, and amendment 71 to clause 13 makes that expressly clear. Does he agree that reports suggesting that the Bill mandates censorship of legal content are completely inaccurate?

Damian Collins

I am grateful to my hon. Friend, and as I said earlier, he is absolutely right. There is no requirement for platforms to take down legal speech, and they cannot be directed to do so. What we have is a transparency requirement to set out their policies, with particular regard to some of the offences I mentioned earlier, and a wide schedule of things that are offences in law that are enforced through the Bill itself. This is a very important distinction to make. I said to him on Second Reading that I thought the general term “legal but harmful” had added a lot of confusion to the way the Bill was perceived, because it created the impression that the removal of legal speech could be required by order of the regulator, and that is not the case.

Debbie Abrahams (Oldham East and Saddleworth) (Lab)

I congratulate the Minister on his promotion and on his excellent chairmanship of the prelegislative scrutiny Committee, which I also served on. Is he satisfied with the Bill in relation to disinformation? It was concerning that there was only one clause on disinformation, and we know the impact—particularly the democratic impact—that that has on our society at large. Is he satisfied that the Bill will address that?

Damian Collins

It was a pleasure to serve alongside the hon. Lady on the Joint Committee. There are clear new offences relating to knowingly false information that will cause harm. As she will know, that was a Law Commission recommendation; it was not in the draft Bill but it is now in the Bill. The Government have also said that as a consequence of the new National Security Bill, which is going through Parliament, we will bring in a new priority offence relating to disinformation spread by hostile foreign states. As she knows, one of the most common areas for organised disinformation has been at state level. As a consequence of the new national security legislation, that will also be reflected in schedule 7 of this Bill, and that is a welcome change.

The Bill requires all services to take robust action to tackle the spread of illegal content and activity. Providers must proactively reduce the risk on their services of illegal activity and the sharing of illegal content, and they must identify and remove illegal content once it appears on their services. That is a proactive responsibility. We have tabled several interrelated amendments to reinforce the principle that companies must take a safety-by-design approach to managing the risk of illegal content and activity on their services. These amendments require platforms to assess the risk of their services being used to commit, or to facilitate the commission of, a priority offence and then to design and operate their services to mitigate that risk. This will ensure that companies put in place preventive measures to mitigate a broad spectrum of factors that enable illegal activity, rather than focusing solely on the removal of illegal content once it appears.

Henry Smith (Crawley) (Con)

I congratulate my hon. Friend on his appointment to his position. On harmful content, there are all too many appalling examples of animal abuse on the internet. What are the Government’s thoughts on how we can mitigate such harmful content, which is facilitating wildlife crime? Might similar online protections be provided for animals to the ones that clause 53 sets out for children?

Damian Collins

My hon. Friend raises an important point that deserves further consideration as the Bill progresses through its parliamentary stages. There is, of course, still a general presumption that any illegal activity that could also constitute illegal activity online—for example, promoting or sharing content that could incite people to commit violent acts—is within scope of the legislation. There are some priority illegal offences, which are set out in schedule 7, but the non-priority offences also apply if a company is made aware of content that is likely to be in breach of the law. I certainly think this is worth considering in that context.

In addition, the Bill makes it clear that platforms have duties to mitigate the risk of their service facilitating an offence, including where that offence may occur on another site, such as can occur in cross-platform child sexual exploitation and abuse—CSEA—offending, or even offline. This addresses concerns raised by a wide coalition of children’s charities that the Bill did not adequately tackle activities such as breadcrumbing—an issue my hon. Friend the Member for Solihull (Julian Knight), the Chair of the Select Committee, has raised in the House before—where CSEA offenders post content on one platform that leads to offences taking place on a different platform.

We have also tabled new clause 14 and a related series of amendments in order to provide greater clarity about how in-scope services should determine whether they have duties with regard to content on their services. The new regulatory framework requires service providers to put in place effective and proportionate systems and processes to improve user safety while upholding free expression and privacy online. The systems and processes that companies implement will be tailored to the specific risk profile of the service. However, in many cases the effectiveness of companies’ safety measures will depend on them making reasonable judgments about types of content. Therefore, it is essential to the effective functioning of the framework that there is clarity about how providers should approach these judgments. In particular, such clarity will safeguard against companies over-removing innocuous content if they wrongly assume mental elements are present, or under-removing content if they act only where all elements of an offence are established beyond reasonable doubt. The amendments make clear that companies must consider all reasonably available contextual information when determining whether content is illegal content, a fraudulent advert, content that is harmful to children, or content that is harmful to adults.

Kirsty Blackman (Aberdeen North) (SNP)

I was on the Bill Committee and we discussed lots of things, but new clause 14 was not discussed: we did not have conversations about it, and external organisations have not been consulted on it. Is the Minister not concerned that this is a major change to the Bill and it has not been adequately consulted on?

Damian Collins

As I said earlier, in establishing the threshold for priority illegal offences, the current threshold of laws that exist offline should provide good guidance. I would expect that as the codes of practice are developed, we will be able to make clear what those offences are. On the racial hatred that the England footballers received after the European championship football final, people have been prosecuted for what they posted on Twitter and other social media platforms. We know what race hate looks like in that context, we know what the regulatory threshold should look at and we know the sort of content we are trying to regulate. I expect that, in the codes of practice, Ofcom can be very clear with companies about what we expect, where the thresholds are and where we expect them to take enforcement action.

Dame Caroline Dinenage (Gosport) (Con)

I congratulate my hon. Friend on taking his new position; we rarely have a new Minister so capable of hitting the ground running. He makes a crucial point about clearness and transparency for both users and the social media providers and other platforms, because it is important that we make sure they are 100% clear about what is expected of them and the penalties for not fulfilling their commitments. Does he agree that opaqueness—a veil of secrecy—has been one of the obstacles, and that a whole raft of content has been taken down for the wrong reasons while other content has been left to proliferate because of the lack of clarity?

Damian Collins

That is entirely right, and in closing I say that the Bill does what we have always asked for it to do: it gives absolute clarity that illegal things offline must be illegal online as well, and be regulated online. It establishes clear responsibilities and liabilities for the platforms to do that proactively. It enables a regulator to hold the platforms to account on their ability to tackle those priority illegal harms and provide transparency on other areas of harmful content. At present we simply do not know about the policy decisions that companies choose to make: we have no say in it; it is not transparent; we do not know whether they do it. The Bill will deliver in those important regards. If we are serious about tackling issues such as fraud and abuse online, and other criminal offences, we require a regulatory system to do that and proper legal accountability and liability for the companies. That is what the Bill and the further amendments deliver.

Alex Davies-Jones (Pontypridd) (Lab)

It is an honour to respond on the first group of amendments on behalf of the Opposition.

For those of us who have been working on this Bill for some time now, it has been extremely frustrating to see the Government take such a siloed approach in navigating this complex legislation. I remind colleagues that in Committee Labour tabled a number of hugely important amendments that sought to make the online space safer for us all, but the Government responded by voting against each and every one of them. I certainly hope the new Minister—I very much welcome him to his post—has a more open-minded approach than his predecessor and indeed the Secretary of State; I look forward to what I hope will be a more collaborative approach to getting this legislation right.

With that in mind, it must be said that time and again this Government claim that the legislation is world-leading but that is far from the truth. Instead, once again the Government have proposed hugely significant and contentious amendments only after line-by-line scrutiny in Committee; it is not the first time this has happened in this Parliament, and it is extremely frustrating for those of us who have debated this Bill for more than 50 hours over the past month.

I will begin by touching on Labour’s broader concerns around the Bill. As the Minister will be aware, we believe that the Government have made a fundamental mistake in their approach to categorisation, which undermines the very structure of the Bill. We are not alone in this view and have the backing of many advocacy and campaign groups including the Carnegie UK Trust, Hope Not Hate and the Antisemitism Policy Trust. Categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet.

We all know that smaller platforms such as 4chan and BitChute have significant numbers of users who are highly motivated to promote very dangerous content. Their aim is to promote radicalisation and to spread hate and harm.

Alex Davies-Jones

My hon. Friend is absolutely right, and has touched on elements that I will address later in my speech. I will look at cross-platform harm and breadcrumbing; the Government have taken action to address that issue, but they need to go further.

Damian Collins

I am sorry to intervene so early in the hon. Lady’s speech, and thank her for her kind words. I personally agree that the question of categorisation needs to be looked at again, and the Government have agreed to do so. We will hopefully discuss it next week during consideration of the third group of amendments.

Alex Davies-Jones

I welcome the Minister’s commitment, which is something that the previous Minister, the hon. Member for Croydon South (Chris Philp), also committed to in Committee. However, it should have been in the Bill to begin with, or been tabled as an amendment today so that we could discuss it on the Floor of the House. We should not have to wait until the Bill goes to the other place to discuss this fundamental, important point that I know colleagues on the Minister’s own Back Benches have been calling for. Here we are, weeks down the line, with nothing having been done to fix that problem, which we know will be a persistent problem unless action is taken. It is beyond frustrating that no indication was given in Committee of these changes, because they have wide-ranging consequences for the effects of the Bill. Clearly, the Government are distracted with other matters, but I remind the Minister that Labour has long called for a safer internet, and we are keen to get the Bill right.

Let us start with new clause 14, which provides clarification about how online services should determine whether content should be considered illegal, and therefore how the illegal safety duty should apply. The new clause is deeply problematic, and is likely to reduce significantly the amount of illegal content and fraudulent advertising that is correctly identified and acted on. First, companies will be expected to determine whether content is illegal or fraudulent based on information that is “reasonably available to a provider”, with reasonableness determined in part by the size and capacity of the provider. That entrenches the problems I have outlined with smaller, high-risk companies being subject to fewer duties despite the acute risks they pose. Having less onerous applications of the illegal safety duties will encourage malign actors to migrate illegal activity on to smaller sites that have less pronounced regulatory expectations placed on them. That has particularly concerning ramifications for children’s protections, which I will come on to shortly. On the other end of the scale, larger sites could use new clause 14 to argue that their size and capacity, and the corresponding volumes of material they are moderating, makes it impractical for them reliably and consistently to identify illegal content.

The second problem arises from the fact that the platforms will need to have “reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”. That significantly raises the threshold at which companies are likely to determine that content is illegal. In practice, companies have routinely failed to remove content where there is clear evidence of illegal intent. That has been the case in instances of child abuse breadcrumbing, where platforms use their own definitions of what constitutes a child abuse image for moderation purposes. Charities believe it is inevitable that companies will look to use this clause to minimise their regulatory obligations to act.

Finally, new clause 14 and its resulting amendments do not appear to be adequately future-proofed. The new clause sets out that judgments should be made “on the basis of all relevant information that is reasonably available to a provider.” However, on Meta’s first metaverse device, the Oculus Quest product, that company records only two minutes of footage on a rolling basis. That makes it virtually impossible to detect evidence of grooming, and companies can therefore argue that they cannot detect illegal content because the information is not reasonably available to them. The new clause undermines and weakens the safety mechanisms that the Minister, his team, the previous Minister, and all members of the Joint Committee and the Public Bill Committee have worked so hard to get right. I urge the Minister to reconsider these amendments and withdraw them.

I will now move on to improving the children’s protection measures in the Bill. In Committee, it was clear that one thing we all agreed on, cross-party and across the House, was trying to get the Bill to work for children. With colleagues in the Scottish National party, Labour Members tabled many amendments and new clauses in an attempt to achieve that goal. However, despite their having the backing of numerous children’s charities, including the National Society for the Prevention of Cruelty to Children, 5Rights, Save the Children, Barnardo’s, The Children’s Society and many more, the Government sadly did not accept them. We are grateful to those organisations for their insights and support throughout the Bill’s passage.

We know that children face significant risks online, from bullying and sexist trolling to the most extreme grooming and child abuse. Our amendments focus in particular on preventing grooming and child abuse, but before I speak to them, I associate myself with the amendments tabled by our colleagues in the Scottish National party, the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I associate myself with the sensible changes they have suggested to the Bill at this stage, including a change to children’s access assessments through amendment 162 and a strengthening of duties to prevent harm to children caused by habit-forming features through amendment 190.

Since the Bill was first promised in 2017, the number of online grooming crimes reported to the police has increased by more than 80%. Last year, around 120 sexual communication with children offences were committed every single week, and those are only the reported cases. The NSPCC has warned that that amounts to a

“tsunami of online child abuse”.

We now have the first ever opportunity to legislate for a safer world online for our children.

However, as currently drafted, the Bill falls short by failing to grasp the dynamics of online child abuse and grooming, which rarely occur on one single platform or app, as mentioned by my hon. Friend the Member for Oldham East and Saddleworth (Debbie Abrahams). In well-established grooming pathways, abusers exploit the design features of open social networks to contact children, then move their communication across to other, more encrypted platforms, including livestreaming sites and encrypted messaging services. For instance, perpetrators manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with large numbers of children, whom they then groom through direct messages before moving to encrypted services such as WhatsApp, where they coerce children into sending sexual images. That range of techniques is often referred to as child abuse breadcrumbing, and is a significant enabler of online child abuse.

I will give a sense of how easy it is for abusers to exploit children by recounting the words and experiences of a survivor, a 15-year-old girl who was groomed on multiple sites:

“I’ve been chatting with this guy online who’s…twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to ‘prove my trust’ to him, like doing video chats with my chest exposed. Every time I did these things for him, he would ask for more and I felt like it was too late to back out. This whole thing has been slowly destroying me and I’ve been having thoughts of hurting myself.”

I appreciate that it is difficult listening, but that experience is being shared by thousands of other children every year, and we need to be clear about the urgency that is needed to change that.

It will come as a relief to parents and children that, through amendments 58 to 61, the Government have finally agreed to close the loophole that allowed for breadcrumbing to continue. However, I still wish to speak to our amendments 15, 16, and 17 to 19, which were tabled before the Government changed their mind. Together with the Government’s amendments, these changes will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material.

Amendment 15 would ensure that platforms have to include in their illegal content risk assessment content that

“reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

Amendment 16 would ensure that platforms have to maintain proportionate systems and processes to minimise the presence of such content on their sites. The wording of our amendments is tighter and includes aiding the discovery or dissemination of content, whereas the Government’s amendments cover only “commission or facilitation”. Can the Minister tell me why the Government chose that specific wording and opposed the amendments that we tabled in Committee, which would have done the exact same thing? I hope that in the spirit of collaboration that we have fostered throughout the passage of the Bill with the new Minister and his predecessor, the Minister will consider the merit of our amendments 15 and 16.

Labour is extremely concerned about the significant powers that the Bill in its current form gives to the Secretary of State. We see that approach to the Bill as nothing short of a shameless attempt at power-grabbing from a Government whose so-called world-leading Bill is already failing in its most basic duty of keeping people safe online. Two interlinked issues arise from the myriad powers granted to the Secretary of State throughout the Bill: the first is the unjustified intrusion of the Secretary of State into decisions that are about the regulation of speech, and the second is the unnecessary level of interference and threat to the independence of Ofcom that arises from the powers of direction over Ofcom in its day-to-day matters and operations. That is not good governance, and it is why Labour has tabled a range of important amendments that the Minister must carefully consider. None of us wants the Bill to place undue powers in the hands of only one individual. That is not a normal approach to regulation, so I fail to see why the Government have chosen to go down that route in this case.

--- Later in debate ---
Damian Collins

I think we would all agree that when we look at the priority harms set out in the Bill, women and girls are disproportionately the victims of those offences. The groups in society that the Bill will most help are women and girls in our community. I am happy to work with the hon. Lady and all hon. Members to look at what more we can do on this point, both during the passage of the Bill and in future, but as it stands the Bill is the biggest step forward in protecting women and girls, and all users online, that we have ever seen.

Alex Davies-Jones

I am grateful to the Minister for the offer to work on that further, but we have an opportunity now to make real and lasting change. We talk about how we tackle this issue going forward. How can we solve the problem of violence against women and girls in our community? Three women a week are murdered at the hands of men in this country—that is shocking. How can we truly begin to tackle a culture change? This is how it starts. We have had enough of words. We have had enough of Ministers standing at the Dispatch Box saying, “This is how we are going to tackle violence against women and girls; this is our new plan to do it.” They have an opportunity to create a new law that makes it a priority harm, and that makes women and girls feel like they are being listened to, finally. I urge the Minister and Members in all parts of the House, who know that this is a chance for us finally to take that first step, to vote for new clause 3 today and make women and girls a priority by showing understanding that they receive a disproportionate level of abuse and harm online, and by making them a key component of the Bill.

Mr David Davis (Haltemprice and Howden) (Con)

I join everybody else in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), to the Front Bench. He is astonishingly unusual in that he is both well-intentioned and well-informed, a combination we do not always find among Ministers.

I will speak to my amendments to the Bill. I am perfectly willing to be in a minority of one—one of my normal positions in this House. To be in a minority of one on the issue of free speech is an honourable place to be. I will start by saying that I think the Bill is fundamentally mis-designed. It should have been several Bills, not one. It is so complex that it is very difficult to forecast the consequences of what it sets out to do. It has the most fabulously virtuous aims, but unfortunately the way things will be done under it, with the use of Government organisations to make decisions that, properly, should be taken on the Floor of the House, is in my view misconceived.

We all want the internet to be safe. Right now, there are too many dangers online—we have been hearing about some of them from the hon. Member for Pontypridd (Alex Davies-Jones), who made a fabulous speech from the Opposition Front Bench—from videos propagating terror to posts promoting self-harm and suicide. But in its well-intentioned attempts to address those very real threats, the Bill could actually end up being the biggest accidental curtailment of free speech in modern history.

There are many reasons to be concerned about the Bill. Not all of them are to be dealt with in this part of the Report stage—some will be dealt with later—and I do not have time to mention them all. I will make one criticism of the handling of the Bill at this point. I have seen much smaller Bills have five days on Report in the past. This Bill demands more than two days. That was part of what I said in my point of order at the beginning.

One of the biggest problems is the “duties of care” that the Bill seeks to impose on social media firms to protect users from harmful content. That is a more subtle issue than the tabloid press have suggested. My hon. Friend the Member for Croydon South (Chris Philp), the previous Minister, made that point and I have some sympathy with him. I have spoken to representatives of many of the big social media firms, some of which cancelled me after speeches that I made at the Conservative party conference on vaccine passports. I was cancelled for 24 hours, which was an amusing process, and they put me back up as soon as they found out what they had done. Nevertheless, that demonstrated how delicate and sensitive this issue is. That was a clear suppression of free speech without any of the pressures that are addressed in the Bill.

When I spoke to the firms, they made it plain that they did not want the role of online policemen, and I sympathise with them, but that is what the Government are making them do. With the threat of huge fines and even prison sentences if they consistently fail to abide by any of the duties in the Bill—I am using words from the Bill—they will inevitably err on the side of censorship whenever they are in doubt. That is the side they will fall on.

Worryingly, the Bill targets not only illegal content, which we all want to tackle—indeed, some of the practice raised by the Opposition Front Bencher, the hon. Member for Pontypridd should simply be illegal full stop—but so-called “legal but harmful” content. Through clause 13, the Bill imposes duties on companies with respect to legal content that is “harmful to adults”. It is true that the Government have avoided using the phrase “legal but harmful” in the Bill, preferring “priority content”, but we should be clear about what that is.

The Bill’s factsheet, which is still on the Government’s website, states on page 1:

“The largest, highest-risk platforms will have to address named categories of legal but harmful material”.

This is not just a question of transparency—they will “have to” address that. It is simply unacceptable to target lawful speech in this way. The “Legal to Say, Legal to Type” campaign, led by Index on Censorship, sums up this point: it is both perverse and dangerous to allow speech in print but not online.

Damian Collins

As I said, a company may be asked to address this, which means that it has to set out what its policies are, how it would deal with that content and its terms of service. The Bill does not require a company to remove legal speech that it has no desire to remove. The regulator cannot insist on that, nor can the Government or the Bill. There is nothing to make legal speech online illegal.

Mr Davis

That is exactly what the Minister said earlier and, indeed, said to me yesterday when we spoke about this issue. I do not deny that, but this line of argument ignores the unintended consequences that the Bill may have. Its stated aim is to achieve reductions in online harm, not just illegal content. Page 106 of the Government’s impact assessment lists a reduction in the prevalence of legal but harmful content as a “key evaluation” question. The Bill aims to reduce that—the Government say that both in the online guide and the impact assessment. The impact assessment states that an increase in “content moderation” is expected because of the Bill.

A further concern is that the large service providers already have terms and conditions that address so-called legal but harmful content. A duty to state those clearly and enforce them consistently risks legitimising and strengthening the application of those terms and conditions, possibly through automated scanning and removal. That is precisely what happened to me before the Bill was even dreamed of. That was done under an automated system, backed up by somebody in Florida, Manila or somewhere who decided that they did not like what I said. We have to bear in mind how cautious the companies will be. That is especially worrying because, as I said, providers will be under significant pressure from outside organisations to include restrictive terms and conditions. I say this to Conservative Members, and we have some very well-intentioned and very well-informed Members on these Benches: beware of the gamesmanship that will go on in future years in relation to this.

Ofcom and the Department see these measures as transparency measures—that is the line. Lord Michael Grade, who is an old friend of mine, came to see me and he talked about this not as a pressure, but as a transparency measure. However, these are actually pressure measures. If people are made to announce things and talk about them publicly, that is what they become.

It is worth noting that several free speech and privacy groups have expressed scepticism about the provisions, yet they were not called to give oral evidence in Committee. A lot of other people were, including pressure groups on the other side and the tech companies, which we cannot ignore, but free speech advocates were not.

--- Later in debate ---
Chris Philp

I did of course hear what was said by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis). To be honest, I think that increased scrutiny of content which might constitute abuse or harassment, whether of women or of ethnic minorities, is to be warmly welcomed. The Bill provides that the risk assessors must pay attention to the characteristics of the user. There is no cross-reference to the Equality Act—I know the hon. and learned Lady has submitted a request on that, to which my successor Minister will now be responding—but there are references to characteristics in the provisions on safety duties, and those characteristics do of course include gender and race.

In relation to the risk that these duties are over-interpreted or over-applied, for the first time ever there is a duty for social media firms to have regard to freedom of speech. At present these firms are under no obligation to have regard to it, but clause 19(2) imposes such a duty, and anyone who is concerned about free speech should welcome that. Clauses 15 and 16 go further: clause 15 creates special protections for “content of democratic importance”, while clause 16 does the same for content of journalistic importance. So while I hugely respect and admire my right hon. Friend the Member for Haltemprice and Howden, I do not agree with his analysis in this instance.

I would now like to ask a question of my successor. He may wish to refer to it later or write to me, but if he feels like intervening, I will of course give way to him. I note that four Government amendments have been tabled; I suppose I may have authorised them at some point. Amendments 72, 73, 78 and 82 delete some words in various clauses, for example clauses 13 and 15. They remove the words that refer to treating content “consistently”. The explanatory note attached to amendment 72 acknowledges that, and includes a reference to new clause 14, which defines how providers should go about assessing illegal content, what constitutes illegal content, and how content is to be determined as being in one of the various categories.

As far as I can see, new clause 14 makes no reference to treating, for example, legal but harmful content “consistently”. According to my quick reading—without the benefit of highly capable advice—amendments 72, 73, 78 and 82 remove the obligation to treat content “consistently”, and it is not reintroduced in new clause 14. I may have misread that, or misunderstood it, but I should be grateful if, by way of an intervention, a later speech or a letter, my hon. Friend the Minister could give me some clarification.

Damian Collins

I think that the codes of practice establish what we expect the response of companies to be when dealing with priority illegal harm. We would expect the regulator to apply those methods consistently. If my hon. Friend fears that that is no longer the case, I shall be happy to meet him to discuss the matter.

Chris Philp

Clause 13(6)(b), for instance, states that the terms of service must be

“applied consistently in relation to content”,

and so forth. As far as I can see, amendment 72 removes the word “consistently”, and the explanatory note accompanying the amendment refers to new clause 14, saying that it does the work of the previous wording, but I cannot see any requirement to act consistently in new clause 14. Perhaps we could pick that up in correspondence later.

Damian Collins

If there is any area of doubt, I shall be happy to follow it up, but, as I said earlier, I think we would expect that if the regulator establishes through the codes of practice how a company will respond proactively to identify illegal priority content on its platform, it is inherent that that will be done consistently. We would accept the same approach as part of that process. As I have said, I shall be happy to meet my hon. Friend and discuss any gaps in the process that he thinks may exist, but that is what we expect the outcome to be.

Chris Philp

I am grateful to my hon. Friend for his comments. I merely observe that the “consistency” requirements were written into the Bill, and, as far as I can see, are not there now. Perhaps we could discuss it further in correspondence.

Let me turn briefly to clause 40 and the various amendments to it—amendments 44, 45, 13, 46 and others—and the remarks made by the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), about the Secretary of State’s powers. I intervened on the hon. Lady earlier on this subject. It also arose in Committee, when she and many others made important points on whether the powers in clause 40 went too far and whether they impinged unreasonably on the independence of the regulator, in this case Ofcom. I welcome the commitments made in the written ministerial statement laid last Thursday—coincidentally shortly after my departure—that there will be amendments in the Lords to restrict the exercise of those powers by the Secretary of State to exceptional circumstances. I heard the point made by the hon. Member for Ochil and South Perthshire that it was unclear what “exceptional” meant. The term has a relatively well-defined meaning in law, but the commitment in the WMS goes further and says that the bases upon which the power can be exercised will be specified and limited to certain matters such as public health or matters concerning international relations. That will severely limit the circumstances in which those powers can be used, and I think it would be unreasonable to expect Ofcom, as a telecommunications regulator, to have expertise in those other areas that I have just mentioned. I think that the narrowing is reasonable, for the reasons that I have set out.

--- Later in debate ---
Damian Collins

I agree with my hon. Friend on both points. I discussed the point about researcher access with him last week, when our roles were reversed, so I am sympathetic to that. There is a difference between that and the researcher access that the Digital Services Act in Europe envisages, which will not have the legal powers that Ofcom will have to compel and demand access to information. It will be complementary but it will not replace the primary powers that Ofcom will have, which will really set our regime above those elsewhere. It is certainly my belief that the algorithmic amplification of harmful content must be addressed in the transparency reports and that, where it relates to illegal activities, it must absolutely be within the scope of the regulator to state that actively promoting illegal content to other people is an offence under this legislation.

Chris Philp

On my hon. Friend’s first point, he is right to remind the House that the obligations to disclose information to Ofcom are absolute; they are hard-edged and they carry criminal penalties. Researcher access in no way replaces that; it simply acts as a potential complement to it. On his second point about algorithmic promotion, of course any kind of content that is illegal is prohibited, whether algorithmically promoted or otherwise. The more interesting area relates to content that is legal but perceived as potentially harmful. We have accepted that the judgments on whether that content stays up or not are for the platforms to make. If they wish, they can choose to allow that content simply to stay up. However, it is slightly different when it comes to algorithmically promoting it, because the platform is taking a proactive decision to promote it. That may be an area that is worth thinking about a bit more.

Damian Collins

On that point, if a platform has a policy not to accept a certain sort of content, I think the regulators should expect it to say in its transparency report what it is doing to ensure that it is not actively promoting that content through a newsfeed, on Facebook or “next up” on YouTube. I expect that to be absolutely within the scope of the powers we have in place.

Chris Philp

In terms of content that is legal but potentially harmful, as the Bill is drafted, the platforms will have to set out their policies, but their policies can say whatever they like, as we discussed earlier. A policy could include actively promoting content that is harmful through algorithms, for commercial purposes. At the moment, the Bill as constructed gives them that freedom. I wonder whether that is an area that we can think about making slightly more prescriptive. Giving them the option to leave the content up there relates to the free speech point, and I accept that, but choosing to algorithmically promote it is slightly different. At the moment, they have the freedom to choose to algorithmically promote content that is toxic but falls just on the right side of legality. If they want to do that, that freedom is there, and I just wonder whether it should be. It is a difficult and complicated topic and we are not going to make progress on it today, but it might be worth giving it a little more thought.

I think I have probably spoken for long enough on this Bill, not just today but over the last few months. I broadly welcome these amendments but I am sure that, as the Bill completes its stages, in the other place as well, there will be opportunities to slightly fine-tune it that all of us can make a contribution to.

--- Later in debate ---
Joanna Cherry

That is why I am giving the Bill a cautious welcome, but I still stand by my very legitimate concerns about the chilling effect of aspects of this Bill. I will give some examples in a moment about the problems that have arisen when organisations such as Twitter are left to their own devices on their moderation of content policy.

As all hon. Members will be aware, under the Equality Act there are a number of protected characteristics. These include: age; gender reassignment; being married or in a civil partnership; being pregnant or on maternity leave; disability; race, including colour, nationality, ethnic or national origin; religion or belief; sex and sexual orientation. It is against the law to discriminate, victimise or harass anyone because of any of those protected characteristics, but Twitter does discriminate against some of the protected characteristics. It often discriminates against women in the way that I described in an intervention earlier. It takes down expressions of feminist belief, but refuses to take down expressions of the utmost violent intent against women. It also discriminates against women who hold gender-critical beliefs. I remind hon. Members that, in terms of the Employment Appeal Tribunal’s decision in the case of Maya Forstater, the belief that sex matters is worthy of respect in a democratic society and, under the Equality Act, people cannot lawfully discriminate against women, or indeed men, who hold those views.

Twitter also sometimes discriminates against lesbians, gay men and bisexual people who assert that their sexual orientation is on the basis of sex, not gender, despite the fact that same-sex orientation, such as I hold, is a protected characteristic under the Equality Act.

At present, Twitter claims not to be covered by the Equality Act. I have seen correspondence from its lawyers that sets out the purported basis for that claim, partly under reference to schedule 25 to the Equality Act, and partly because it says:

“Twitter UK is included in an Irish Company and is incorporated in the Republic of Ireland. It does pursue economic activity through a fixed establishment in the UK but that relates to income through sales and marketing with the main activity being routed through Ireland.”

I very much doubt whether that would stand up in court, since Twitter is clearly providing a service in the United Kingdom, but it would be good if we took the opportunity of this Bill to clarify that the Equality Act applies to Twitter, so that when it applies moderation of content under the Bill, it will not discriminate against any of the protected characteristics.

The Joint Committee on Human Rights, of which I am currently the acting Chair, looked at this three years ago. We had a Twitter executive before our Committee and I questioned her at length about some of the content that Twitter was content to support in relation to violent threats against women and girls and, on the other hand, some of the content that Twitter took down because it did not like the expression of certain beliefs by feminists or lesbians.

We discovered on the Joint Committee on Human Rights that Twitter’s hateful conduct policy does not include sex as a protected characteristic. It does not reflect the domestic law of the United Kingdom in relation to anti-discrimination law. Back in October 2019, in the Committee’s report on democracy, freedom of expression and freedom of association, we recommended that Twitter should include sex as a protected characteristic in its hateful conduct policy, but Twitter has not done that. It seems Twitter thinks it is above the domestic law of the United Kingdom when it comes to anti-discrimination.

At that Committee, the Twitter executive assured me that certain violent memes that often appear on Twitter directed against women such as me and against many feminists in the United Kingdom, threatening us with death by shooting, should be removed. However, just in the past 48 hours I have seen an example of Twitter’s refusing to remove that meme. Colleagues should be assured that there is a problem here, and I would like us to direct our minds to it, as the Bill gives us an opportunity to do.

Whether or not Twitter is correctly praying in aid the loophole it says there is in the Equality Act—I think that is questionable—the Bill gives us the perfect opportunity to clarify matters. Clause 3 clearly brings Twitter and other online service providers within the regulatory scheme of the Bill as a service with

“a significant number of United Kingdom users”.

The Bill squarely recognises that Twitter provides a service in the United Kingdom to UK users, so it is only a very small step to amend the Bill to make it absolutely clear that when it does so it should be subject to the Equality Act. That is what my new clause 24 seeks to do.

I have also tabled new clauses 193 and 191 to ensure that Twitter and other online platforms obey non-discrimination law regarding Ofcom’s production of codes of practice and guidance. The purpose of those new clauses is to ensure that Ofcom consults with persons who have expertise in the Equality Act before producing those codes of practice.

I will not push the new clauses to a vote. I had a very productive meeting with the Minister’s predecessor, the hon. Member for Croydon South (Chris Philp), who expressed a great deal of sympathy when I explained the position to him. I have been encouraged by the cross-party support for the new clauses, both in discussions before today with Members from all parties and in some of the comments made by various hon. Members today.

I am really hoping that the Government will take my new clauses away and give them very serious consideration, that they will look at the Joint Committee’s report from October 2019 and that either they will adopt these amendments or perhaps somebody else will take them forward in the other place.

Damian Collins

I can assure the hon. and learned Lady that I am happy to carry on the dialogue that she had with my predecessor and meet her to discuss this at a further date.

Joanna Cherry

I am delighted to hear that. I must tell the Minister that I have had a huge number of approaches from women, from lesbians and from gay men across the United Kingdom who are suffering as a result of Twitter’s moderation policy. There is a lot of support for new clause 24.

Of course, it is important to remember that the Equality Act protects everyone. Gender reassignment is there with the protected characteristics of sex and sexual orientation. It is really not acceptable for a company such as Twitter, which provides a service in the United Kingdom, to seek to flout and ignore the provisions of our domestic law on anti-discrimination. I am grateful to the Minister for the interest he has shown and for his undertaking to meet me, and I will leave it at that for now.

--- Later in debate ---
It is a modest proposal for the Bill, but it could have a major impact on the industry out there at the moment, which for many years has been completely unregulated. I do not propose pressing my new clause to a vote, but will the Minister work with his Department of Health and Social Care colleagues? Following the Health and Care Act 2022, there is a consultation on the regulations, and we could make a real difference for those I am worried about and concerned for—the more and more young people who are being bombarded with these adverts. In some cases, dangerous and potentially life-threatening procedures are being sold to them as if they are just like any other service, and they are not.
Damian Collins Portrait Damian Collins
- Hansard - -

The right hon. Gentleman makes a very important point and, as he knows, there is a wider ongoing Government review related to advertising online, which is a very serious issue. I assure him that we will follow up with colleagues in the Department of Health and Social Care to discuss the points he has raised.

Lord Beamish Portrait Mr Jones
- Hansard - - - Excerpts

I am grateful to the Minister and I will be keeping a beady eye to see how far things go. The proposal would make a difference. It is a simple but effective way of protecting people, especially young people.

--- Later in debate ---
Feryal Clark Portrait Feryal Clark (Enfield North) (Lab)
- View Speech - Hansard - - - Excerpts

I join everyone else in the House in welcoming the Minister to his place.

I rise to speak in support of amendments 15 and 16. At the core of this issue is the first duty of any Government: to keep people safe. Too often in debates, which can become highly technical, we lose sight of that fact. We are not just talking about technology and regulation; we are talking about real lives and real people. It is therefore incumbent on all of us in this place to have that at the forefront of our minds when discussing such legislation.

Labelling social media as the wild west of today is hardly controversial—that is plain and obvious for all to see. There has been a total failure on the part of social media companies to make their platforms safe for everyone to use, and that needs to change. Regulation is not a dirty word, but a crucial part of ensuring that as the internet plays a bigger role in every generation’s lives, it meets the key duty of keeping people safe. It has been a decade since we first heard of this Bill, and almost four years since the Government committed to it, so I am afraid that there is nothing even slightly groundbreaking about the Bill as it is today. We have seen progress being made in this area around the world, and the UK is falling further and further behind.

Of particular concern to me is the impact on children and young people. As a mother, I worry for the world that my young daughter will grow up in, and I will do all I can in this place to ensure that children’s welfare is at the absolute forefront. I can see no other system or institution that children are allowed to engage with that has such a want of safeguards and regulation. If there were a faulty slide in a playground, it would be closed off and fixed. If a sports field were covered with glass or litter, that would be reported and dealt with. Whether we like it or not, social media has become the streets our children hang out in, the world they grow up in and the playground they use. It is about time we started treating it with the same care and attention.

There are far too many holes in the Bill that allow for the continued exploitation of children. Labour’s amendments 15 and 16 tackle the deeply troubling issue of “breadcrumbing”. That is where child abusers use social networks to lay trails to illegal content elsewhere online and share videos of abuse edited to fall within content moderation guidelines. The amendments would give the regulators powers to tackle that disgusting practice and ensure that there is a proactive response to it. They would bring into regulatory scope the millions of interactions with accounts that actively enable child abuse. Perhaps most importantly, they would ensure that social media companies tackled child abuse at the earliest possible stage.

In its current form, even with Government amendment 14, the Bill merely reinforces companies’ existing focus only on material that explicitly reaches the criminal threshold. That is simply not good enough. Rather than acknowledging that issue, Government amendments 71 and 72 let social media companies off the hook. They remove the requirement for companies to apply their terms and conditions “consistently”. That was addressed very eloquently by the hon. Member for Croydon South (Chris Philp) and the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), who highlighted that Government amendment 14 simply does not go far enough.

Damian Collins Portrait Damian Collins
- Hansard - -

On the amendments that the former Minister, my hon. Friend the Member for Croydon South (Chris Philp), spoke to, the word “consistently” has not been removed from the text. There is new language that follows the use of “consistently”, but the use of that word will still apply in the context of the companies’ duties to act against illegal content.

Feryal Clark Portrait Feryal Clark
- View Speech - Hansard - - - Excerpts

I welcome the Minister’s clarification and look forward to the amendments being made to the Bill. As they stand, however, the proposals achieve little beyond tying one of our hands behind our back when it comes to keeping children safe. They will undermine the entire regulatory system, rendering it practically ineffective.

Although I welcome the Bill and some of the Government amendments, it still lacks a focus on ensuring that tech companies have the proper systems in place to fulfil their duty of care and keep our children safe. The children of this country deserve better. That is why I wholeheartedly welcome the amendments tabled by my hon. Friend the Member for Pontypridd (Alex Davies-Jones) and urge Government Members to support them.

--- Later in debate ---
Munira Wilson Portrait Munira Wilson (Twickenham) (LD)
- View Speech - Hansard - - - Excerpts

I rise to speak to new clauses 25 and 26 in my name. The Government rightly seek to make the UK the safest place in the world to go online, especially for our children, and some of their amendments will start to address previous gaps in the Bill. However, I believe that the Bill still falls short in its aim not only to protect children from harm and abuse, but, importantly, to empower and enable young people to make the most of the online world.

I welcome the comments that the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) made about how we achieve the balance between rights and protecting children from harm. I also welcome his amendments on children’s wellbeing, which seek to achieve that balance.

With children making up one in five internet users, keeping them safe is more difficult but more important than ever. I speak not only as the mother of two very young children who are growing up with iPads in their hands, but as—like everyone else in the Chamber—a constituency Member of Parliament who speaks regularly to school staff and parents who are concerned about the harms caused by social media in particular, but also those caused by games and other services to which children have access.

The Bill proffers a broad and vague definition of content that is legal yet harmful. As many have already said, it should not be the responsibility of the Secretary of State, in secondary legislation, to make decisions about how and where to draw the line; Parliament should set clear laws that address specific, well-defined harms, based on strong evidence. The clear difficulty that the Government have in defining what content is harmful could have been eased had the Bill focused less on removing harmful content and more on why service providers allow harmful content to spread so quickly and widely. Last year, the 5Rights Foundation conducted an experiment in which it created several fake Instagram profiles for children aged between 14 and 17. When the accounts were searched for the term “skinny”, while a warning pop-up message appeared, among the top results were

“accounts promoting eating disorders and diets, as well as pages advertising appetite-suppressant gummy bears.”

Ultimately, the business models of these services profit from the spread of such content. New clause 26 requires the Government and Ofcom to focus on ensuring that internet services are safe by design. They should not be using algorithms that give prominence to harmful content. The Bill should focus on harmful systems rather than on harmful content.

Damian Collins Portrait Damian Collins
- View Speech - Hansard - -

It does focus on systems as well as content. We often talk about content because it is the exemplar for the failure of the systems, but the systems are entirely within the scope of the Bill.

Munira Wilson Portrait Munira Wilson
- View Speech - Hansard - - - Excerpts

I thank the Minister for that clarification, but there are still many organisations out there, not least the Children’s Charities Coalition, that feel that the Bill does not go far enough on safety by design. Concerns have rightly been expressed about freedom of expression, but if we focus on design rather than content, we can protect freedom of expression while keeping children safe at the same time. New clause 26 is about tackling harms downstream, safeguarding our freedoms and, crucially, expanding participation among children and young people. I fear that we will always be on the back foot when trying to tackle harmful content. I fear that regulators or service providers will become over-zealous in taking down what they consider to be harmful content, removing legal content from their platforms just in case it is harmful, or introducing age gates that deny children access to services outright.

Of course, some internet services are clearly inappropriate for children, and illegal content should be removed—I think we all agree on that—but let us not lock children out of the digital world or let their voices be silenced. Forty-three per cent. of girls hold back their opinions on social media for fear of criticism. Children need a way to exercise their rights. Even the Children’s Commissioner for England has said that heavy-handed parental controls that lock children out of the digital world are not the solution.

I tabled new clause 25 because the Bill’s scope, focusing on user-to-user and search services, is too narrow and not sufficiently future-proof. It should cover all digital technology that is likely to be accessed by children. The term

“likely to be accessed by children”

appears in the age-appropriate design code to ensure that the privacy of children’s data is protected. However, that more expansive definition is not included in the Bill, which imposes duties on only a subset of services to keep children safe. Given rapidly expanding technologies such as the metaverse—which is still in its infancy—and augmented reality, as well as addictive apps and games that promote loot boxes and gambling-type behaviour, we need a much more expansive definition.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I will try to avoid too much preamble, but I thank the former Minister, the hon. Member for Croydon South (Chris Philp), for all his work in Committee and for listening to my nearly 200 contributions, for which I apologise. I welcome the new Minister to his place.

As time has been short today, I am keen to meet the Minister to discuss my new clauses and amendments. If he cannot meet me, I would be keen for him to meet the NSPCC, in particular, on some of my concerns.

Amendment 196 is about using proactive technology to identify CSEA content, which we discussed at some length in Committee. The hon. Member for Croydon South made it very clear that we should use scanning to check for child sexual abuse images. My concern is that new clause 38, tabled by the Lib Dems, might exclude proactive scanning to look for child sexual abuse images. I hope that the Government do not lurch in that direction, because we need proactive scanning to keep children protected.

New clause 18 specifically addresses child user empowerment duties. The Bill currently requires that internet service providers have user empowerment duties for adults but not for children, which seems bizarre. Children need to be able to say yes or no. They should be able to make their own choices about excluding content and not receiving unsolicited comments or approaches from anybody not on their friend list, for example. Children should be allowed to do that, but the Bill explicitly says that user empowerment duties apply only to adults. New clause 18 is almost a direct copy of the adult user empowerment duties, with a few extra bits added. It is important that children have access to user empowerment.

Amendment 190 addresses habit-forming features. I have had conversations about this with a number of organisations, including The Mix. I regularly accessed its predecessor, The Site, more than 20 years ago, and it is concerned that 42% of young people surveyed by YoungMinds show addiction-like behaviour in what they are accessing on social media. There is nothing on that in this Bill. The Mix, the Mental Health Foundation, the British Psychological Society, YoungMinds and the Royal College of Psychiatrists are all unhappy about the Bill’s failure to regulate habit-forming features. It is right that we provide support for our children, and it is right that our children are able to access the internet safely, so it is important to address habit-forming behaviour.

Amendment 162 addresses child access assessments. The Bill currently says that providers need to do a child access assessment only if there is a “significant” number of child users. I do not think that is enough and I do not think it is appropriate, and the NSPCC agrees. The amendment would remove the word “significant.” OnlyFans, for example, should not be able to dodge the requirement to child risk assess its services because it does not have a “significant” number of child users. These sites are massively harmful, and we need to ensure changes are made so they cannot wriggle out of their responsibilities.

Finally, amendment 161 is about live, one-to-one oral communications. I understand why the Government want to exempt live, one-to-one oral communications, as they want to ensure that phone calls continue to be phone calls, which is totally fine, but they misunderstand the nature of things like Discord and how people communicate on Fortnite, for example. People are having live, one-to-one oral communications, some of which are used to groom children. We cannot explicitly exempt them and allow a loophole for perpetrators of abuse in this Bill. I understand what the Government are trying to do, but they need to do it in a different way so that children can be protected from the grooming behaviour we see on some online platforms.

Once again, if the Minister cannot accept these amendments, I would be keen to meet him. If he cannot do that, I ask that the NSPCC have a meeting with him.

Damian Collins Portrait Damian Collins
- Hansard - -

We have had a wide-ranging debate, full of passion and expert opinion from Members in all parts of the House, which shows the depth of interest in this subject and the depth of concern to see the Bill delivered and to make sure we get it right. I speak as someone who only a couple of days ago became the Minister for online safety, although I was previously involved in engaging with the Government on this subject. As I said in my opening remarks, this has been an iterative process, in which Members from across the House have worked successfully with the Government to improve the Bill. That is the spirit in which we should complete its stages, both in the Commons and in the Lords, and look at how we operate this regime once it has been created.

I wish to start by addressing remarks made by the hon. Member for Pontypridd (Alex Davies-Jones), the shadow Minister, and by the hon. Member for Cardiff North (Anna McMorrin) about violence against women and girls. There is a slight assumption that if the Government do not accept an amendment that writes “violence against women and girls” into the priority harms in the Bill, somehow the Bill does not address that issue. I think we would all agree that that is not the case. The provisions on harmful content that is directed at any individual, particularly the new harms offences approved by the Law Commission, do create offences in respect of harm that is likely to lead to actual physical harm or severe psychological harm. As the father of a teenage girl, who was watching earlier but has now gone to do better things, I say that the targeting of young girls, particularly vulnerable ones, with content that is likely to make them more vulnerable is one of the most egregious aspects of the way social media works. It is right that we are looking to address serious levels of self-harm and suicide in the Bill and in the transparency requirements. We are addressing the self-harm and suicide content that falls below the illegal threshold, where a vulnerable young girl is sent and prompted with content that can make her more vulnerable and could lead her to harm herself, or worse. It is absolutely right that that is within the scope of the Bill.

New clause 3, perfectly properly, cites international conventions on violence against women and girls, and how that is defined. At the moment, with the way the Bill is structured, the schedule 7 offences are all based on existing areas of UK law, where there is an existing, clear criminal threshold. Those offences, which are listed extensively, will all apply as priority areas of harm. If there is, through the work of the Law Commission or elsewhere, a clear legal definition of misogyny and violence against women and girls that is not included, I think it should be included within scope. However, if new clause 3 was approved, as tabled, it would be a very different sort of offence, where it would not be as clear where the criminal threshold applied, because it is not cited against existing legislation. My view, and that of the Government, is that existing legislation covers the sorts of offences and breadth of offences that the shadow Minister rightly mentioned, as did other Members. We should continue to look at this—

Anna McMorrin Portrait Anna McMorrin
- Hansard - - - Excerpts

The Minister is not giving accurate information there. Violence against women and girls is defined by article 3 of the Council of Europe convention on preventing violence against women and domestic violence—the Istanbul convention. So there is that definition and it would be valid to put that in the Bill to ensure that all of that is covered.

Damian Collins Portrait Damian Collins
- Hansard - -

I was referring to the amendment’s requirement to list that as part of the priority illegal harms. The priority illegal harms set out in the Bill are all based on existing UK Acts of Parliament where there is a clear established criminal threshold—that is the difference. The spirit of what that convention seeks to achieve, which we would support, is reflected in the harm-based offences written into the Bill. The big change in the structure of the Bill since the draft Bill was published—the Joint Committee on the Draft Online Safety Bill and I pushed for this at the time—is that far more of these offences have been clearly written into the Bill so that it is absolutely clear what they apply to. The new offences proposed by the Law Commission, particularly those relating to self-harm and suicide, are another really important addition. We know what the harms are. We know what we want this Bill to do. The breadth of offences that the hon. Lady and her colleagues have set out is covered in the Bill. But of course as law changes and new offences are put in place, the structure of the Bill, through the inclusion of new schedule 7 on priority offences, gives us the mechanism in the future, through instruments of this House, to add new offences to those primary illegal harms as they occur. I expect that that is what would happen. I believe that the spirit of new clause 3 is reflected in the offences that are written into the Bill.

The hon. Member for Pontypridd mentioned Government new clause 14. It is not true that the Government came up with it out of nowhere. There has been extensive consultation with Ofcom and others. The concern is that some social media companies, and some users of services, may have sought to interpret the criminal threshold as being based on whether a court of law has found that an offence has been committed, and only then might they act. Actually, we want them to pre-empt that, based on a clear understanding of where the legal threshold lies. That is how the regulatory codes work. It is an attempt not to weaken the provision but to bring clarity to the companies and the regulator over its application.

The hon. Member for Ochil and South Perthshire (John Nicolson) raised an important point with regard to the Modern Slavery Act. As the Bill has progressed, we have included existing immigration and trafficking offences. I would be happy to meet him to discuss that aspect further. Serious offences that exist in law should have an application, either as priority harms or as non-priority legal harms, and we should consider how we do that. I do not know whether he intends to press the amendment, but either way, I would be happy to meet him to discuss this further.

My hon. Friend the Member for Solihull, the Chair of the Digital, Culture, Media and Sport Committee, raised an important matter with regard to the power of the Secretary of State, which was a common theme raised by several other Members. The hon. Member for Ochil and South Perthshire rightly quoted me, or my Committee’s report, back to me—always a chilling prospect for a politician. I think we have seen significant improvement in the Bill since the draft Bill was published. There was a time when changes to the codes could be made by the negative procedure; now they have to be by a positive vote of both Houses. The Government have recognised that they need to define the exceptional circumstances in which that provision might be used, and to define specifically the areas that are set out. I accept from the Chair of the Select Committee and my right hon. and learned Friend the Member for Kenilworth and Southam that those things could be interpreted quite broadly—maybe more broadly than people would like—but I believe that progress has been made in setting out those powers.

I would also say that this applies only to the period when the codes of practice are being agreed, before they are laid before Parliament. This is not a general provision. I think sometimes there has been a sense that the Secretary of State can at any time pick up the phone to Ofcom and have it amend the codes. Once the codes are approved by the House they are fixed. The codes do not relate to the duties. The duties are set out in the legislation. This is just the guidance that is given to companies on how they comply. There may well be circumstances in which the Secretary of State might look at those draft codes and say, “Actually, we think Ofcom has given the tech companies too easy a ride here. We expected the legislation to push them further.” Therefore it is understandable that in the draft form the Secretary of State might wish to have the power to raise that question, and not dictate to Ofcom but ask it to come back with amendments.

I take on board the spirit of what Members have said and the interest that the Select Committee has shown. I am happy to continue that dialogue, and obviously the Government will take forward the issues that they set out in the letter that was sent round last week to Members, showing how we seek to bring in that definition.

A number of Members raised the issue of freedom of speech provisions, particularly my hon. Friend the Member for Windsor (Adam Afriyie) at the end of his excellent speech. We have sought to bring, in the Government amendments, additional clarity to the way the legislation works, so that it is absolutely clear what the priority legal offences are. Where we have transparency requirements, it is absolutely clear what they apply to. The amendment that the Government tabled reflects the work that he and his colleagues have done, setting out that if we are discussing the terms of service of tech companies, it should be perfectly possible for them to say that this is not an area where they intend to take enforcement action and the Bill does not require them to do so.

The hon. Member for Batley and Spen (Kim Leadbeater) mentioned Zach’s law. The hon. Member for Ochil and South Perthshire raised that before the Joint Committee. So, too, did my hon. Friend the Member for Watford (Dean Russell); he and the hon. Member for Ochil and South Perthshire are great advocates on that. It is a good example of how a clear offence, something that we all agree to be wrong, can be tackled through this legislation; in this case, a new offence will be created, to prevent the pernicious targeting of people with epilepsy with flashing images.

Finally, in response to the speech by the hon. Member for Aberdeen North (Kirsty Blackman), I certainly will continue the dialogue with the NSPCC on the serious issues that she has raised. Obviously, child protection is foremost in our minds as we consider the legislation. She made some important points about the ability to scan for child abuse images within encrypted services. The Government have recently made further announcements on that, which will be reflected as the Bill progresses through the House.

Nigel Evans Portrait Mr Deputy Speaker (Mr Nigel Evans)
- Hansard - - - Excerpts

To assist the House, I anticipate two votes on this first section and one vote immediately on the next, because it has already been moved and debated.

--- Later in debate ---
Ronnie Cowan Portrait Ronnie Cowan
- View Speech - Hansard - - - Excerpts

I absolutely agree. We can also look at this from the point of view of gambling reform and age verification for that. The technology is there, and we can harness and use it to protect people. All I am asking is that we do not let this slip through the cracks this evening.

Damian Collins Portrait Damian Collins
- View Speech - Hansard - -

We have had an important debate covering a series of extremely serious topics. While the Government may not agree with the amendments that have been tabled, that is not because we take the issues raised any less seriously.

The right hon. Member for Kingston upon Hull North (Dame Diana Johnson) spoke very powerfully. I have also met Leigh Nicol, the lady she cited, and she discussed with me the experience that she had. Sadly, it was during lockdown and it was a virtual meeting rather than face to face. There are many young women, in particular, who have experienced the horror of having intimate images shared online without their knowledge or consent and then gone through the difficult experience of trying to get them removed, even when it is absolutely clear that they should be removed and are there without their consent. That is the responsibility of the companies and the platforms to act on.

Thinking about where we are now, before the Bill passes, the requirement to deal with illegal content, even the worst illegal content, on the platforms is still largely based on the reporting of that content, without the ability for us to know how effective the platforms are at actually removing it. That is largely based on old legislation. The Bill will move things on significantly by creating proactive responsibilities not just to discover illegal content but to act to mitigate it, and to be audited to see how effectively that is done. Under the Bill, that includes content that would be considered to be an abuse of children: a child cannot give consent to have sex or to appear in pornographic content. Companies need to make sure that what they are doing is sufficient to meet that duty.

It should be for the regulator, Ofcom, as part of putting together the codes of practice, to understand, even on more extreme content, what systems companies have in place to ensure that they are complying with the law; that they are certainly not knowingly hosting content that has been flagged to them as non-consensual pornography or child abuse images, which is effectively what pornography involving minors would be; and that, as hon. Members have said, they are using available technologies to deliver that.

Jess Phillips Portrait Jess Phillips
- View Speech - Hansard - - - Excerpts

We have an opportunity here today to make sure that the companies are doing it. I am not entirely sure why we would not take that opportunity to legislate to make sure that they are. With the greatest of respect to the Minister, who is back in a position of authority, it sounds an awful lot like the triumph of hope over experience.

Damian Collins Portrait Damian Collins
- View Speech - Hansard - -

It is because of the danger of such a sentiment that this Bill is so important. It not only sets the targets and requirements for companies to act against illegal content, but enables a regulator to ensure that they have the systems and processes in place to do so, that they are using appropriate technology and that they apply the principle that their systems should be effective at addressing this issue. If those systems are deficient, that is a failure on the company’s part. It cannot be good enough for a company to say, “It is too difficult to do”, when it is not using technologies that would readily solve the problem. We believe that the technologies the companies have, and the powers of the regulator to put proper codes of practice in place and to order the companies to comply, will be sufficient to address the concern that the hon. Lady raises.

Diana Johnson Portrait Dame Diana Johnson
- View Speech - Hansard - - - Excerpts

I am a little taken aback that the Minister believes that the legislation will be sufficient. I do not understand why he has not responded to the point that my hon. Friend the Member for Birmingham, Yardley (Jess Phillips) was making that we could make this happen by putting the proposal in the Bill and saying, “This is a requirement.” I am not sure why he thinks that is not the best way forward.

Damian Collins Portrait Damian Collins
- View Speech - Hansard - -

It is because the proposal would not make such content more illegal than it is now. It is already illegal and there are already legal duties on companies to act. The regulator’s job is to ensure they have the systems in place to do that effectively, and that is what the Bill sets out. We believe that the Bill addresses the serious issue that the right hon. Lady raises in her amendments. That legal requirement is there, as is the ability to have the systems in place.

If I may, I will give a different example based on the fraud example given by the shadow Minister, the hon. Member for Worsley and Eccles South (Barbara Keeley). On the Joint Committee that scrutinised the Bill, we pushed hard to have fraudulent ads included within the scope of the Bill, which has been one of the important amendments to it. The regulator can consider what systems the company should have in place to identify fraud, but also what technologies it employs to make it far less likely that fraud would be there in the first place. Google has a deal with the Financial Conduct Authority, whereby it limits advertisers from non-accredited companies advertising on its platform. That makes it far less likely that fraud will be discovered because, if the system works, only properly recognised organisations will be advertising.

Facebook does not have such a system in place. As a consequence, since the Google system went live, we have seen a dramatic drop in fraud ads on Google, but a substantial increase in fraud ads on Facebook and platforms such as Instagram. That shows that if we have the right systems in place, we can have a better outcome and change the result. The job of the regulator with illegal pornography and other illegal content should be to look at those systems and say, “Do the companies have the right technology to deliver the result that is required?” If they do not, that would still be a failure to comply with the codes.

Barbara Keeley

The Minister is quoting a case that I quoted in Committee, and the former Minister, the hon. Member for Croydon South (Chris Philp), would not accept amendments on this issue. We could have tightened up on fraudulent advertising. If Google can do that for financial ads, other platforms can do it. We tabled an amendment that the Government did not accept. I do not know why this Minister is quoting something that we quoted in Committee—I know he was not there, but he needs to know that we tried this and the former Minister did not accept what we called for.

Damian Collins

I am quoting that case merely because it is a good example of how, if we have better systems, we can get a better result. As part of the codes of practice, Ofcom will be able to look at some of these other systems and say to companies, “This is not just about content moderation; it is about having better systems that detect known illegal activity earlier and prevent it from getting on to the platform.” It is not about how quickly it is removed, but how effective companies are at stopping it ever being there in the first place. That is within the scope of regulation, and my belief is that those powers exist at the moment and therefore should be used.

Jess Phillips

Just to push on this point, images of me have appeared on pornographic sites. They were not necessarily illegal images of anything bad happening to me, but other Members of Parliament in this House and I have suffered from that. Is the Minister telling me that this Bill will allow me to get in touch with that site and have an assurance that that image will be taken down and that it would be breaking the law if it did not do so?

Damian Collins

The Bill absolutely addresses the sharing of non-consensual images in that way, so that would be something the regulator should take enforcement action against—

Damian Collins

Well, the regulator is required, and has the power, to take enforcement action against companies for failing to do so. That is what the legislation sets out, and we will be in a very different place from where we are now. That is why the Bill constitutes a very significant reform.

Lloyd Russell-Moyle

Will the Minister give way?

Damian Collins

Very briefly, and then I want to wrap up.

Lloyd Russell-Moyle

Could the Minister give me a reassurance about when consent is withdrawn? The image may initially have been there “consensually”—I would put that in inverted commas—so the platform is okay to put it there. However, if someone contacts the platform saying that they now want to change their consent—they may want to take a role in public life, having previously had a different role; I am not saying that about my hon. Friend the Member for Birmingham, Yardley (Jess Phillips)—my understanding is that there is no ability legally to enforce that content coming down. Can the Minister correct me, and if not, why is he not supporting new clause 7?

Damian Collins

For people who have appeared in pornographic films consensually and signed contracts to do so, that is a very different matter from intimate images being shared without consent. I am saying that the Bill sets out very clearly—it did not do so in draft form—that non-consensual sexual images and extreme pornography are within the scope of the regulator’s power. The regulator should take action not just on what a company does to take such content down when it is discovered after the event, but on what systems the company has in place and whether it deploys all available technology to make sure that such content is never there in the first place.

Before closing, I want to touch briefly on the point raised about the Secretary of State’s powers to designate priority areas of harm. This is now under the affirmative procedure in the Bill, and it requires the approval of both Houses of Parliament. The priority illegal harms will be based on offences that already exist in law, and we are writing those priority offences into the Bill. The other priorities will be areas where the regulator will seek to test whether companies adhere to their terms of service. The new transparency requirements will set that out, and the Government have said that we will set out in more detail which of those priority areas of harm such transparency will apply to. There is still more work to be done on that, but we have given an indicative example. However, when it comes to adding a new priority illegal offence to the Bill, the premise is that it will already be an offence that Parliament has created, and writing it into the Bill will be done with the positive consent of Parliament. I think that is a substantial improvement on where the Bill was before. I am conscious that I have filled my time.

Question put, That the clause be read a Second time.

Channel 4 Privatisation

Tuesday 14th June 2022


Commons Chamber
Ms Dorries

The impression given is that Channel 4, as a result of being sold, will cease to exist. That is not the case. Those independent production companies are actually overloaded with work. We made more films in the UK in the last quarter of last year than were made in Hollywood. This whole sector of broadcasting and film making is booming. We are selling Channel 4 so that it can have more inward investment, not taxpayers’ money, and so that it can make more content, not less. The work will continue for independent production companies, not least from many of the companies that are coming into the UK to make films and television content, just as in Northern Ireland.

Our vision for Channel 4 is one where it continues to do all the things it does best, while being freed from the shackles that currently restrict it. I repeat: all the things it does best. That means it will continue to make diverse, interesting and edgy content with independent production companies, just as it does now.

The Opposition motion talks about protecting Channel 4’s PSB remit. Anyone who takes the time to look at our proposals will see that they pose no threat whatsoever to that PSB remit, though Opposition Members talk as if there were one. Under private ownership, Channel 4 will still be required to commission a minimum volume of programming from independent producers—I hope the hon. Member for Cardiff South and Penarth (Stephen Doughty) heard that—just as all other PSBs are required to do. Under private ownership, we will maintain Channel 4’s existing obligations for regional production outside London and England, just as all other PSBs are required to do. Under private ownership, Channel 4 will still be required to provide original, innovative and educational programming that represents the breadth of society, as well as primetime news and current affairs—again, just as all other PSBs are required to do. Under private ownership—that is the rub here, is it not? The words “private ownership” are the nub of it. Under private ownership, we would also have the freedom to unlock Channel 4’s full potential by removing the publisher-broadcaster restriction, which the Labour party seems to want to protect, but which is the very restriction preventing Channel 4 from achieving long-term financial security. What company pays 100% for content but does not own the content? There is no other company that would regard that as a successful business model. The restriction effectively prohibits the broadcaster from producing and selling its content, denying it a crucial way to make money.

I cannot imagine another company—I look for anyone in this House to reassure me—that would be able to survive by paying 100% of the cost of the business while owning none of the product.

Damian Collins (Folkestone and Hythe) (Con)

In Channel 4’s own response to the Government’s “Up Next” White Paper, it proposed raising £1 billion in private money through a joint venture partner, and that the joint venture partner would retain intellectual property and programming. The idea that the status quo is sustainable is not one that Channel 4 shares, and even it has called for a radical reset of its role.

Ms Dorries

It is exactly as my hon. Friend has outlined. The hon. Member for Manchester Central asked me what Channel 4 said, and one of its responses was that it wants to raise money. It wants to invest and raise money. The state—[Interruption.] Channel 4 is state-owned. The state cannot own a public service broadcaster that takes on the risk of borrowing money. If that goes wrong, it is the taxpayer who has to pay that debt. We as a Government cannot burden the taxpayer with risk, potential debt and responsibility.

Removing the restriction will allow Channel 4 to do exactly what my hon. Friend the Member for Folkestone and Hythe (Damian Collins) says: to raise that revenue stream and improve its long-term sustainability. We can do all those things with a sale, while protecting all that makes Channel 4 unique. We are not looking for any old buyer for this broadcaster. We are looking for the right one—one who shares our ambition for the business and our belief in what makes it special. It is precisely because of what Channel 4 does, and how it does it, be that distinctive programming, news content or film, that we are confident that we will find the right buyer.

Unsurprisingly, though it is early days, there has already been a lot of initial interest from a wide range of potential bidders. When a sale is secured, it will not just benefit Channel 4; we intend to use the proceeds to benefit the entire country. As I said, Channel 4 was originally established to help boost independent production, and it has been successful in that mission—so successful, in fact, that we face a new and very positive challenge. Production studios across the country are booming. They are so in demand that we need more and more people to work in them. We therefore intend to funnel some of the proceeds of the sale into addressing that new challenge and giving people up and down the UK the skills and opportunity to fill those jobs, delivering a creative dividend for all.

As I have to keep reminding those who choose to ignore it, the sale of Channel 4 is just one crucial part of a much larger piece of broadcasting reform, and the question of Channel 4’s long-term sustainability is—[Interruption.] The accusation is being thrown at me from a sedentary position that I am going to get rid of the BBC. It is not good enough to invent accusations from the Front Bench. Commentary has to be based on what the Government are actually proposing and what is actually happening. [Interruption.] Okay, so we did freeze the licence fee—yes. In this environment, that is a cost of living saving. There is absolutely no way, in today’s environment, that we could go to the country and ask individuals to pay for an increase in the BBC’s licence take. I am absolutely amazed that Opposition Front Benchers think that would be an acceptable thing to do, when hard-pressed families are struggling to pay their bills—[Interruption.]

--- Later in debate ---
Damian Collins (Folkestone and Hythe) (Con)

Before I start, I would like to do as the shadow Secretary of State did and declare my entry in the Register of Members’ Financial Interests. I, too, was a guest of Channel 4 at the BAFTA ceremony. I would also declare, as other Members from across the House have done, that I am a fan of “Derry Girls”, as, I am sure, as part of his cross-community work, is the hon. Member for North Antrim (Ian Paisley). This is a channel that makes great programmes that are part of our national psyche and it is an important part of our broadcasting landscape.

However, I say to Opposition Members and some on our side that I have an honest disagreement with Channel 4 and with people who oppose privatisation: the company, although well run, is running into such strong industry headwinds that this option cannot be taken off the table and has to be considered seriously. As Channel 4 said in its own “The Next Episode” response to the Government’s White Paper, all options have to be considered. That has to include the option of privatisation.

The challenges to the sector are very real. A lot has been made of the fact that the last financial year was a successful one for Channel 4 and for the UK advertising industry. There was a major spike in advertising revenues. That is partly to do with a major surge in advertising spend coming out of the pandemic, which saw a big increase in revenues for all broadcasters. The pandemic also meant the delay of the European championships and the Olympics, and such major international tournaments traditionally have a considerable inflationary impact on the advertising market. So we have to look at this in a wider context: the increases in ad revenues seen in 2021 may not be repeated, and the shift away from linear television advertising—traditional spot advertising—to digital media is a continuing trend. Channel 4 may be the leading UK broadcaster in that respect, but currently only 16% of its revenues come from digital advertising. Although it aims to raise that figure to 30% by 2025, that may still be a significant challenge.

If there were a major challenge to the TV industry and the advertising industry—if there were a recession, for example; TV advertising is traditionally one of the earliest and worst-hit sectors—Channel 4 would be much more vulnerable to the economic shocks that followed, because it does not have other revenue sources. These trends are familiar across the PSBs: the commercial ones have seen long-term declines in revenue, and all have seen declines in audience numbers, including at peak time. However, the BBC can make money from making programmes. ITV can make money from making programmes, for itself and for other people. Channel 4 does not have that option.

Let us look at the period before the pandemic. In trying to observe a trend, that is probably the fairest thing to do, because we do not yet quite know what impact the pandemic has had, in terms of lockdown in 2020 and recovery in 2021. What does the picture look like? I think everyone here would agree that when Channel 4 was set up, its purpose was to invest its money in UK original productions made by independent production companies. It was set up at a time when the BBC and the ITV companies made most of their programmes in house, so it was a necessary vehicle to get financial investment into the independent production sector. This was a sector in which Sky, Amazon and Netflix did not exist, and which was far more reliant on that funding.

If we look at what has happened to Channel 4—and this is true for other PSBs as well—we see that in 2006 it spent £516 million on first-run original content. In 2019, the year before the pandemic, the figure was £436 million, so we have seen a 15% decline. That declining spend also bought a lot less, because inflation in the TV production market is making it more and more expensive to make programmes. So in 2006 Channel 4 broadcast 3,388 hours of first-run original content, whereas in 2019 it broadcast 2,473 hours, which represents a decline of 27%. This shift away from traditional broadcasters towards digital markets, with the pressure that puts on their budgets and the declining amount of money they can afford to spend on new programming, has been a trend for a number of years now. The concern we must have is that if there were a shock in the digital ad market, and if Channel 4 could not hit its target of growing digital revenues as broadcast revenues decline, it would be much more vulnerable. It does not have the reserves and it does not have the ability to make money elsewhere. That is why even Channel 4 is proposing significant changes to its remit.

Kevin Brennan

The hon. Gentleman says that Channel 4 is proposing this, but that proposal was a direct response to a request from the Secretary of State to propose alternative sources of revenue. It was not initiated by Channel 4 because of its concerns about its finances.

Damian Collins

As I pointed out earlier in the debate, in that document Channel 4 itself says that it requires a radical reset of its role. If it is to take the opportunity of the changing digital landscape in the future, it needs to be in a position to invest more money. That extra investment will not come from advertising revenues. Channel 4 has been the most successful traditional UK broadcaster in switching to digital, but even there the best one can say about the last few years is that the increase in digital revenues has just about kept pace with the decline in broadcasting revenues. Digital is not raising more money incrementally for Channel 4 to invest in programming at a time when new entrants to the market are increasing their spend significantly—by hundreds of millions of pounds. The danger is that Channel 4, with its unique voice, will be less able to compete, less able to commission, and will run less new programming than it could in the past and that other broadcasters will do. That has to be addressed.

Channel 4 has said that its role needs to be radically reset. It is calling for its digital streaming service, All 4, to be global—to reach a global audience—to increase ad revenues. That is a sensible idea, but the independent production companies that make programmes for Channel 4 would have to give their consent to being unable to sell their programming internationally on their own, as they would in other territories. It calls for the creation of a joint venture in which Channel 4 holds a minority stake that would raise £1 billion to invest in new programming over the next five years. That would be a sensible measure to bring in a significant extra boost in revenue, although it would only bring Channel 4 back to where it was in 2006. As part of that joint venture, Channel 4 would have the intellectual property rights for programming and make money from selling those programmes. Channel 4 believes that may be within its current remit, although it would significantly change the spirit of the remit. The independent production companies might have concerns about that extension, but it is probably necessary.

The idea that the status quo can continue is wrong. It would be wrong of us to assume that it can continue and to say that we will deal with this problem, if it comes, in the future, and in the meantime see Channel 4 gradually wither on the vine, with declining revenues, declining investment in programming, unable to compete, until the point where it cannot go on and requires a bail-out from the Government or the other PSBs. That is the risk we are taking.

The Government’s “Up Next” White Paper is not an ideological tract; it is a sensible and serious look at real issues in the TV sector. We may have different views on what the right format would be; Channel 4 has put forward its ideas and other bidders will do the same. I think the bidders will include more than the traditional players, and we should look at all those options, but they will all be options for change, suggesting a way that Channel 4 can raise more money to invest in what we want it to do—making great programmes.

--- Later in debate ---
Ian Murray

Yes, it is the cultural levelling up that Channel 4 has been able to achieve as part of its own agenda.

Analysis by EY—Ernst and Young—which was mentioned by my hon. Friend the Member for Cardiff South and Penarth (Stephen Doughty), estimates that over £1 billion would be lost from the UK’s nations and regions if Channel 4 did not invest in the way that it does now, and that nearly 2,500 jobs in the creative sector would be at risk. That is independent analysis. It is not just those directly employed by the broadcaster who would be impacted, but the entire British creative economy. As my hon. Friend mentioned, it is a creative economy that relies on economies of scale, security of funding and a pipeline of skills.

In its lifetime, Channel 4 has invested—we have heard this already—£12 billion in the independent production sector in this country. Every year, it works with almost 300 production companies, many of which are tiny, as well as medium and large-scale production companies. This proposal does not just impact the big stars in London studios, but the camera operators, the crew runners, the location scouts and everything that makes a production happen in every single region and nation of the UK. The harsh reality is that a privatised Channel 4 would be commercially incentivised to buy in programmes from overseas instead of supporting new and innovative projects in the UK. Why? Because it costs a lot of money to make content, and that would hit profits. Look at some of the big loss-makers, such as the award-winning Paralympics coverage, which has not really been mentioned in this debate. That coverage is a big financial loss for Channel 4, but it does it, and it does it incredibly well.

The right hon. Member for Hereford and South Herefordshire (Jesse Norman), who spoke at the end, made a critical point. Not only did he note that there are no options papers on the future of Channel 4 beyond privatisation; he also hit the nail on the head. A lot of the contributions from the Government Benches have been about the headwinds that are about to hit Channel 4. Those headwinds will hit Channel 4 whether it is in the public sector or the private sector. It is hardly a good selling point to say, “We want to privatise one of our national assets to ensure it is not hit by these headwinds,” when a commercial broadcaster would, in times of hardship, cut the very things that Channel 4 does so well.

Damian Collins

The hon. Gentleman implied that commercial companies would look to buy in programmes, rather than make them. Why is it, then, that most TV companies that have their own production studios are massively investing in making more programmes? ITV, BBC, Sky and Netflix are. Everyone recognises that the way to make money in the TV market today is to make programmes to sell, not buy in from other people.

Ian Murray

Yes, but Channel 4 puts all those issues on the table in terms of investing directly in production. Channel 4 provides that shop window. If you say to the production company in Scotland that makes “Location, Location, Location”, “Would you like to make that for Channel 4, but you don’t get the IP?”, either the costs will shoot up or they will not make it at all. This model works. It is part of the ecosystem. Production in places such as Leeds, Salford and Scotland is working so well at the moment because we have the BBC, ITV studios and Channel 4—all different parts of that ecosystem working together—so we have the economies of scale, the skills and the ability for people to invest, because they know they can make great shows and great films in those places.

Let me reflect on some of the contributions that have been made this afternoon. The Father of the House kicked us off, and was absolutely correct to say that Channel 4 does not want privatisation. The Secretary of State is essentially saying, “We know best, so we will do to Channel 4 what we think is best.” The Father of the House concluded by saying, “stop messing it around”. That is right. Why try to fix something that is not broken?

My hon. Friend the Member for Cardiff West (Kevin Brennan) in his own wonderful style did a superb “Yes Minister” characterisation. Channel 4 does not have a problem to solve, but the Government are trying to find one; he was right to call it “Parliamentary Pointless”.

I will reflect on the impacts on Scotland later in my speech, but the hon. Member for Ceredigion (Ben Lake) was right to say that privatising Channel 4 will have a huge impact on the Welsh production sector. With the BBC investing in Cardiff and Channel 4 putting productions into the city, the sector in Wales has flourished in a way that it did not before.

The hon. Member for North Antrim (Ian Paisley) is right to say that Channel 4 is an enabler. We need the big production companies to be able to make programmes in order to seed smaller production companies and the entire industry. If we do not have those productions—the big returning drama shows—it is difficult to maintain a production company in an area, which Channel 4 does so well.

The hon. Member for Aberconwy (Robin Millar) said, “D’oh!”—maybe the first person to mention Homer Simpson in the Chamber, although I am not sure whether he was impersonating Homer Simpson or recalling the motto of Downing Street.

The hon. Member for Milton Keynes North (Ben Everitt) told us that he watched Channel 4 at night when he was younger, but that he has never watched “Naked Attraction”. Mr Deputy Speaker, he needs to come to the House and correct the record, because nobody believes him! [Laughter.]

My hon. Friend the Member for Canterbury (Rosie Duffield) reeled off a list of wonderful television programmes and films that Channel 4 has made over the years, including “Drop the Dead Donkey”, or, as it was rebranded last week, “Vote of No Confidence”.

The hon. Member for Mansfield (Ben Bradley) also tried to create a problem that does not exist. Along with a number of contributors this afternoon, he said that Channel 4 should be released from its shackles to be able to borrow. Well, it has not borrowed, or needed to borrow, in 40 years. Maybe it will not need to borrow in the next 40.

Let us not forget about film, as this is not just about the impact of privatisation on television. As we heard from my hon. Friend the Member for Manchester Central, the broadcaster is the single largest investor in British film through Film4. We can see how wonderful some of those films have been, as they have won BAFTAs every single year and have really put the British film industry on the map. I think that gets to the heart of why so many people are outraged by the Government’s proposals.

Our great nation punches so well above its weight when it comes to our cultural impact on the world. There are few better examples of that than the British stars of screen—big and small. Many of our most famous faces got their big break through Film4 productions, many of which were huge risks to Channel 4, but because of its funding model and way it was set up, it was able to take those risks and some of those productions were hugely successful. I think of Ewan McGregor, Olivia Colman and Dev Patel, who we have heard about already, and film-makers including Danny Boyle and Steve McQueen. It is little wonder that so many stars, film-makers and directors have come out against and condemned the Government’s plans.

What of training, skills and jobs? We have heard from my hon. Friend the Member for Leeds North West (Alex Sobel) on this issue. Let us not beat about the bush: getting into the television and film industry is incredibly difficult for those who are lower down the socioeconomic scale. Channel 4 has been at the forefront of helping young people to get into the industry through 4Skills, which gives 15,000 young people a year opportunities to get into the sector. That costs money and is not the kind of thing that a commercial broadcaster will do. It has an industry-leading production training scheme through its supply chain that focuses solely on social mobility. That is all at risk. Why? Because it is not protected in the White Paper.

The move to sell off Channel 4 will have a particular impact on the Scottish creative economy. Since 2007, Channel 4 has spent more than £220 million on Scottish productions—about £20 million a year in recent years. It is the key commissioner from Scottish independent production companies and other Scottish broadcasters such as Scottish Television. Channel 4’s features and daytime team, its largest creative team, is now based at its Glasgow office. The broadcaster’s emerging indie fund and its indie growth fund have provided support to fantastic Scottish production companies such as Black Camel Pictures, which was responsible for the BAFTA-winning “Sunshine on Leith”.

And who can forget “Location, Location, Location”, one of Channel 4’s most successful shows, which is produced by IWC, a Scottish production company? Maybe the Prime Minister might need Phil and Kirstie’s help in finding a new place soon. I hope so. Even TV’s most famous house hunters might struggle to find a place with a built-in karaoke bar, but that is the challenge

Channel 4’s influence is not just on Scottish television. Film4 has produced memorable Scottish hits—perhaps none more so than “Trainspotting” in my home city, even though it did not portray Edinburgh in the best of lights. Film-making brings in £600 million a year in UK-bound tourism, right across the United Kingdom, although I am not sure that “Trainspotting” did Ladbrokes toilets any good. Channel 4 has given us generation-defining entertainment, and it will again.

I am grateful for the SNP’s support for our motion. The hon. Member for Ochil and South Perthshire (John Nicolson) made an excellent speech, but I must say that Channel 4 is also under threat from the SNP’s plans for independence. The SNP proposes to put an end to Channel 4 in Scotland, because Scotland would be independent, and it set out in 2014 that it would do the same for the BBC. What are its proposals for Channel 4 and the BBC? Governments are attacking our public sector broadcasters because those broadcasters hold the powerful to account, whether they like it or not. They are attacking the very principle of a UK-wide public service broadcaster delivering for diverse audiences all over the country. None the less, we are grateful for the SNP’s support for the motion.

Today, though, it is for Conservatives to make their decision. As we have heard, 91% of respondents to the Government’s own consultation made their opposition to the proposals clear. Those who oppose the proposals include the advertisers that pay for advertising on Channel 4 because of the diverse audiences that it produces, which other broadcasters cannot reach. If the Secretary of State is looking for a “Countdown” of Conservative Members who do not support her proposals, I say to her, “Three from the top, two from the middle and one large one.”

Will Conservative Members vote to sell the broadcaster to a private entity that is likely to centralise creative output in London, or will they vote to continue a model that invests in our creative economy in their very own constituencies? Will they sell a cornerstone of modern British culture to the highest bidder, or will they continue a great British institution that proudly exports our culture around the world?

The country would be grateful, the industry would be grateful and viewers would be grateful if the Secretary of State scrapped this privatisation. In the words of Mrs Doyle from another of the channel’s famous shows, “Go on, go on, go on.”

Football Governance

Damian Collins Excerpts
Monday 25th April 2022

Commons Chamber
Nigel Huddleston Portrait Nigel Huddleston
- View Speech - Hansard - - - Excerpts

I am afraid the hon. Lady is misinterpreting what I have outlined today. We are pursuing a process, and we have not announced delays; we have announced a route forward. A White Paper is a perfectly reasonable step that we have to take because these are complex issues. We will move forward on all these important areas.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- View Speech - Hansard - -

If football clubs were already trading within the rules of their competitions, as my hon. Friend knows, many of them would not get into difficulty. Will he confirm the key points of principle: that the regulator will have the power to access real-time financial information from the clubs to see whether they are trading within the rules, and that the owners and directors test will apply not only at the point of purchase? He has spoken of licensing conditions several times. Can he confirm that, from the outset, the regulator will be issuing licences that can be rescinded if the clubs do not comply?

Nigel Huddleston Portrait Nigel Huddleston
- View Speech - Hansard - - - Excerpts

I can confirm each of those points, particularly the last one. A licensing regime is exactly that: a person must abide by the conditions in order to get a licence. My hon. Friend’s other points are similarly accurate, including on the principle of an owners and directors test. One problem is that there is an owners and directors test only when a club is sold. We will be looking at greater frequency, for the reasons he outlined.

Online Safety Bill

Damian Collins Excerpts
2nd reading
Tuesday 19th April 2022

Commons Chamber
Lucy Powell Portrait Lucy Powell
- Hansard - - - Excerpts

I will not, sorry. Facebook whistleblower Frances Haugen, who I had the privilege of meeting, cited many examples to the Joint Committee on the draft Online Safety Bill of Facebook’s models and algorithms making things much worse. Had the Government chosen to follow the Joint Committee recommendations for a systems-based approach rather than a content-driven one, the Bill would be stronger and concerns about free speech would be reduced.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - -

Will the hon. Lady give way?

Lucy Powell Portrait Lucy Powell
- Hansard - - - Excerpts

I am sorry, but too many people want to speak. Members should talk to their business managers, who have cut—[Interruption.] I know the hon. Gentleman was Chair of the Committee—[Interruption.]

--- Later in debate ---
Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - -

This is an incredibly important Bill. It has huge cross-party support and was subject to scrutiny by the Joint Committee, which produced a unanimous report, which shows the widespread feeling in both Houses and on both sides of this Chamber that we should legislate. I do feel, though, that I should respond to some of the remarks of the shadow Secretary of State, the hon. Member for Manchester Central (Lucy Powell), on the Joint Committee report.

I agree with the hon. Member that, unless this legislation covers the systems of social media companies as well as the content hosted, it will not be effective, but it is my belief that it does that. Throughout the evidence that the Committee took, including from Ofcom and not just the Government, it was stated to us very clearly that the systems of social media companies are within scope and that, in preparing the risk registers for the companies, Ofcom can look at risks. For Facebook, that could include the fact that the news feed recommends content to users, while for someone on TikTok using For You, it could be the fact that the company is selecting—algorithmically ranking—content that someone might like. That could include, for a teenage girl, content that promoted self-harm that was being actively recommended by the company’s systems, or, as Frances Haugen set out, extremist content and hate speech being actively promoted and recommended by the systems.

That would be in scope. The algorithms are within scope, and part of Parliament’s job will be to ensure on an ongoing basis that Ofcom is using its powers to audit the companies in that way, to gain access to information in that way, and to say that the active promotion of regulated content by a social media company is an offence. In passing this Bill, we expect that that will be fully in scope. If the legislation placed no obligation on a company to proactively identify any copies of content that it had judged should not be there and had taken down, we would have a very ineffective system. In effect, we would have what Facebook does to assess content today. If that were effective, we would not need this legislation, but it is woefully ineffective, so the algorithms and the systems are in scope. The Bill gives Ofcom the power to regulate on that basis, and we have to ensure that it does that in preparing the risk registers.

Following what my Joint Committee colleague, the hon. Member for Bristol North West (Darren Jones), said, the point about the codes of practice is really important. The regulator sets the codes of practice for companies to follow. The Government set out in their response to the Joint Committee report that the regulator can tell companies if their response is not adequate. If an area of risk has been identified where the company has to create policies to address that risk and the response is not good enough, the regulator can still find the company in breach. I would welcome it if the Minister wished to say more about that, either today or as the Bill goes through the House, because it is really important. The response of a company to a request from the regulator, having identified a risk on its platforms, cannot be: “Oh, sorry, we don’t have a policy on that.” It has to be able to set those policies. We have to go beyond just enforcing the terms of service that companies have created for themselves. Making sure they do what they say they are going to do is really important, as the Secretary of State said, but we should be able to push them to go further.

I agree, though, with the hon. Member for Manchester Central and other hon. Members about regulation being based on risk and not just size. In reality, Ofcom will have to make judgment calls on smaller sites that are posing a huge risk or a new risk that has been identified.

The regulator will have the power to regulate Metaverse and VR platforms. Anything that is a user-to-user service is already in scope of the legislation. The challenge for the regulator will be in moderating conversations between two people in a virtual room, which is much harder than when people are posting text-based content. The technology will have to adapt to do that, but we should start that journey based on the fact that that is already in scope.

Finally, on the much-used expression “legal but harmful”, I am pleased the Government took one of our big recommendations, which is to write more offences clearly into the Bill, so that it is clear what is actually being regulated: the promotion of self-harm becomes regulated content, and hate speech is part of the regulated content. The job of the regulator then is to set the threshold at which intervention should come, and I think that should be based on case law. On many of these issues, such as the abuse of the England footballers after the final of the European championships, people have been sentenced in court for what they did. That creates good guidance and a good baseline for what hate speech is in that context and where we would expect intervention. It would be much easier for the Bill, for the services that are regulated and for the people who post content to know what the offences are and where the regulatory standard sits. Rather than describing those things as “legal but harmful”, we should describe them as what they are: regulated offences based on existing offences in law.

The Government made an important step in responding to say that the Government, in seeking amendment to the codes of practice that bring new offences within scope of these priority areas of harm, should have to go through an affirmative process in both Houses. That is really important. Ultimately, the regulation should be based on our laws and changes should be based on decisions taken in this House.

Several hon. Members rose—
- Hansard -

Online Abuse

Damian Collins Excerpts
Monday 28th February 2022

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - -

It is a pleasure to speak in the debate and address the petitions on online abuse, which reflect the understandable growing public concern about the prevalence of abusive material and behaviour online, in particular on social media. I chaired the Joint Committee on the Draft Online Safety Bill—the pre-legislative scrutiny Committee. We had consistent evidence from people about the nature of the abuse they had faced online, and how that abuse creates spaces online where not only are people targeted, but hate speech, racist speech, vile abuse and extremism have become normalised within the echo chambers of certain sectors of social media.

People are frustrated because they have raised these concerns directly with social media platforms, their Member of Parliament and the Government. The sort of action that they think should be taken to combat abusive behaviour, and that would be taken if the behaviour took place in a public space, is not being taken online. People cannot just log out of their accounts. We cannot just say, “Well, the easiest way to avoid online abuse is not to be online.” Many people are required to be online due to the nature of their work. Why should people not be able to go online and enjoy, as the hon. Member for Newcastle upon Tyne North (Catherine McKinnell) said, the connectivity that comes from being on social media without being the victim or target of abuse? We do not create spaces for abuse in the physical world; we should not create such spaces in the online world either.

As the hon. Member said, the draft Online Safety Bill is big and quite complicated. I urge everyone to read the 60,000-word report by the Joint Committee. The Bill becomes a lot simpler and clearer after reading the report. At the heart of the Bill is something simple: activity that is illegal offline should be regulated online, and the laws that Parliament has created and that our courts enforce offline should be applied online as well. Offences are committed not just by someone who is abusing another person, but by a platform that actively hosts, amplifies and creates an audience for that content, and platforms should be liable to combat abuse. Indeed, without that liability, we will not be able to combat it.

Online platforms should not be amplifying abusive behaviour to others. In a shocking incident last summer—one of many—members of the English national football team were the subject of racist abuse after the final of the European championships. That was foreseen by the football authorities, who warned social media companies that this would be a problem, but the companies did nothing to combat it. The issue is not just that foreseeable abuse was not acted on and stopped; it is that social media companies’ systems were making people aware that the abuse was taking place, and even highlighting the key words. People were prosecuted in the courts following that event because of their use of racist language, and the racial abuse directed towards others.

We know what the offences are, and we know what the thresholds are for legal action. That standard should be enforced online. The draft Online Safety Bill should, by regulating illegal activity online, set the minimum standards required. It should allow us to set the safety standards that we think should be set, so that they are based not on terms of service written by American technology companies in California, but on our laws and the thresholds we set to keep people safe. They should be proactively enforced. The companies should not be waiting for people to complain; they should be proactively looking for this content.

I will focus my remarks on a part of the Bill that it is important to get right: priority harms. The term that has been used is “legal but harmful”. Clearly illegal content exists; content that constitutes child abuse or a terrorist offence is clearly illegal and no context is needed to understand why it is bad. The Government propose creating a schedule of priority harms of other offences. I am pleased to see the steps the Government have taken since the Joint Committee report was published; they are writing more of those offences into the Bill, and making it absolutely clear when they apply. However, we do need certainty; victims need to know that an offence has been committed. They should be seeking redress based on the fact that an offence has been committed against them. The social media companies need to know which offences are in scope, and what they are expected to do in different situations. The regulator needs to give certainty, based on the law and its regulatory powers, about where those thresholds are.

That is why the Joint Committee recommended removing the definition of “legal but harmful” from the Bill, and instead writing into the Bill those offences that it applies to, making it really clear which offences are in scope. Once the regulator is established, its first job will be to create the risk registers from which it will create the codes of practice that apply to the different companies. Then Ofcom will be well placed to act if it feels that there are gaps in the regulatory regime—if there are offences that should have been included but were not, or if new offences need to be added to the priority harm areas. It is much better if offences in the regulatory regime are based on laws that are understood, that Parliament has passed and that the courts enforce, so that there is no ambiguity. I understand the desire to future-proof the legislation, and to say, “Something bad may happen that we cannot foresee; there should be provision to regulate for that.” However, it is difficult to take enforcement action against a social media company for not acting against content that was not proscribed or recognised by the regulator.

The Government have done a good job of bringing so many offences into the Bill. One of the concerns was whether equalities legislation was enforceable online—how would laws against race hate and other abuse be enforced online? The Government have made clear how that could be done. However, it would be better if the regulatory regime was based on offences named in the Bill, rather than our having an additional general definition of something that is “legal but harmful”.

I do not believe that the Government’s intention is for the regulator to start creating new offences; the Government want to bring clarity. Having a tight focus on existing offences, as regulated through the codes of practice that set the minimum safety standards, gives more clarity and more certainty. That is what people want. They want to know what they are being protected from, that the companies have to be proactive in removing certain sorts of content, and that there is no ambiguity or confusion over what that is and how it should be done. When the Minister responds and the Government give their thoughts on the final version of the draft Online Safety Bill, they should bear that in mind. The Bill will provide a lot more clarity if we give people certainty—both those who are concerned about freedom of expression, and those who want to know that certain offences will be covered by the Bill.

English Football League Governance: Derby County FC

Damian Collins Excerpts
Tuesday 18th January 2022

Commons Chamber

Urgent Questions are proposed each morning by backbench MPs, and up to two may be selected each day by the Speaker. Chosen Urgent Questions are announced 30 minutes before Parliament sits each day.

Each Urgent Question requires a Government Minister to give a response on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Chris Philp Portrait Chris Philp
- View Speech - Hansard - - - Excerpts

The hon. Gentleman raises an important issue. A number of people have concerns about the role that agents play—not least football clubs, managers and indeed sometimes players themselves. It is a slightly, and I choose my words diplomatically, opaque—I was going to say murky—business. As the Sports Minister responds to the fan-led review, this will be an issue that he addresses.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- View Speech - Hansard - -

It is the failure of football governance that has created the problem at Derby. In this crisis, once again, the fans see that no one is interested in their concerns, the long-term future of the club or the impact on the people of Derby. If the EFL had enforced its financial rules effectively, this would not have happened, yet it is the EFL’s rules that will trigger Derby’s expulsion from the league by insisting that all football debts and liabilities are met. The regulator that my hon. Friend the Member for Chatham and Aylesford (Tracey Crouch) proposes in her report will stop this happening, but not before 1 February. Is the Minister prepared to consider what other direct interventions DCMS might make to keep Derby County in business?

Chris Philp Portrait Chris Philp
- View Speech - Hansard - - - Excerpts

If the current system had been functioning perfectly or properly, there would have been no need for the fan-led review. There are certainly shortcomings, as my hon. Friend points out, which the fan-led review is designed to address. On the way in which the EFL’s rules may have precipitated or triggered the current situation, I repeat my call and, I think, the call of all hon. Members on both sides of the House for pragmatism from those involved, including the EFL, to get this matter resolved as quickly as possible to save a great club.

BBC Funding

Damian Collins Excerpts
Monday 17th January 2022

Commons Chamber
Nadine Dorries Portrait Ms Dorries
- Hansard - - - Excerpts

I will write to the right hon. Gentleman with those specific figures.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- View Speech - Hansard - -

Eleven years ago, the then director-general of the BBC said the corporation would have to do “fewer things better.” The current director-general has challenged the organisation to prioritise how it spends its money and maximises its commercial revenues. Does the Secretary of State agree that reform is necessary for the BBC to thrive in the digital age in both how it works and how it is funded?

Nadine Dorries Portrait Ms Dorries
- Hansard - - - Excerpts

We cannot ignore the fact that our digital landscape is transforming and advancing at a rapid pace, which has resulted in people, and especially the younger generation, changing their viewing habits. I would be accused of being a dinosaur if I stood here and said we should just let the BBC carry on as it is with this licence fee model. As my right hon. Friend the Member for Maldon (Mr Whittingdale) highlighted, 700,000 fewer people are paying the licence fee. We have to do something now to sustain the BBC and maintain this British beacon. We have to act now to ensure the BBC remains the BBC and is here for the future.

Draft Online Safety Bill Report

Damian Collins Excerpts
Thursday 13th January 2022

Commons Chamber
Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- View Speech - Hansard - -

I beg to move,

That this House has considered the Report of the Joint Committee on the draft Online Safety Bill, HC 609.

I would like to start by thanking the members and Clerks of our Joint Committee, who put in a tremendous effort to deliver its report. In 11 sitting weeks, we received more than 200 submissions of written evidence, took oral evidence from 50 witnesses and held four further roundtable meetings with outside experts, as well as Members of both Houses. I am delighted to see my Joint Committee colleagues Lord Gilbert and Baroness Kidron in the Gallery. I thank the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Croydon South (Chris Philp), and the Secretary of State for the open and collaborative way in which they worked with the Committee throughout the process and our deliberations. I also thank Ofcom, which provided a lot of constructive guidance and advice to the Committee as we prepared the report.

This feels like a moment that has been a long time coming. There has been huge interest on both sides of the House in the Online Safety Bill ever since the publication of the first White Paper in April 2019, and then there were two Government responses, the publication of the draft Bill and a process of pre-legislative scrutiny by the Joint Committee. I feel that the process has been worthwhile: in producing a unanimous report, I think the Committee has reflected the wide range of opinions that we received and put forward some strong ideas that will improve the Bill, which I hope will get a Second Reading later in the Session. I believe that it has been a process worth undertaking, and many other Lords and Commons Committees have at the same time been looking at the important issues around online safety and the central role that online services play in our lives.

The big tech companies have had plenty of notice that this is coming. During that period, have we seen a marked improvement? Have we seen the introduction of effective self-regulation? Have the companies set a challenge to Parliament, saying “You don’t really need to pass this legislation, because we are doing all we can already”? No. If anything, the problems have got worse. Last year, we saw an armed insurrection in Washington DC in which a mob stormed the Capitol building, fuelled by messages of hate and confrontation that circulated substantially online. Last summer, members of the England football team were subject to vile racist abuse at the end of the final—the football authorities had warned the companies that that could happen, but they did not prepare for it or act adequately at the time.

As Rio Ferdinand said in evidence to the Joint Committee, people should not have to put up with this. People cannot just put their device down—it is a tool that they use for work and to stay in communication with their family and friends—so they cannot walk away from the abuse. If someone is abused in a room, they can leave the room, but they cannot walk away from a device that may be the first thing that they see in the morning and one of the last things that they see at night.

We have seen an increase in the incidence of child abuse online. The Internet Watch Foundation has produced a report today that shows that yet again there are record levels of abusive material related to children, posing a real child safety risk. It said the same in its report last year, and the issues are getting worse. Throughout the pandemic, we have seen the rise of anti-vaccine conspiracies.

Jim Shannon Portrait Jim Shannon (Strangford) (DUP)
- View Speech - Hansard - - - Excerpts

I commend the hon. Gentleman for bringing this forward. We have a colleague in Northern Ireland, Diane Dodds MLA, who has had unbelievably vile abuse towards her and her family. Does the hon. Gentleman agree that there is a huge loophole and gap in this Bill—namely, that the anonymity clause remains that allows comments such as those to my colleague and friend Diane Dodds, which were despicable in the extreme? There will be no redress and no one held accountable through this Bill. The veil of anonymity must be lifted and people made to face the consequences of what they are brave enough to type but not to say.

Baroness Laing of Elderslie Portrait Madam Deputy Speaker (Dame Eleanor Laing)
- Hansard - - - Excerpts

Order. The hon. Gentleman is not trying to make a speech, is he? No, he is not.

Damian Collins Portrait Damian Collins
- Hansard - -

The hon. Gentleman raises an important issue. The Committee agreed in the report that there must be an expedited process of transparency, so that when people are using anonymity to abuse other people—saying things for which in public they might be sued or have action taken against them—it must be much easier to swiftly identify who those people are. People must know that if they post hate online directed at other people and commit an offence in doing so, their anonymity will not be a shield that will protect them: they will be identified readily and action taken against them. Of course there are cases where anonymity may be required, when people are speaking out against an oppressive regime or victims of abuse are telling their story, but it should not be used as a shield to abuse others. We set that out in the report and the hon. Gentleman is right that the Bill needs to move on it.

We are not just asking the companies to moderate content; we are asking them to moderate their systems as well. Their systems play an active role in directing people towards hate and abuse. A study commissioned by Facebook showed that over 60% of people who joined groups that showed extremist content did so at the active recommendation of the platform itself. In her evidence to the Committee, Facebook whistleblower Frances Haugen made clear the active role of systems in promoting and driving content through to people, making them the target of abuse, and making vulnerable people more likely to be confronted with and directed towards content that will exacerbate their vulnerabilities.

Facebook and companies like it may not have invented hate but they are driving hate and making it worse. They must be responsible for these systems. It is right that the Bill will allow the regulator to hold those companies to account not just for what they do or do not take down, but for the way they use the systems that they have created and designed to make money for themselves by keeping people on them longer, such that they are responsible for them. The key thing at the heart of the Bill and at the heart of the report published by the Joint Committee is that the companies must be held liable for the systems they have created. The Committee recommended a structural change to the Bill to make it absolutely clear that what is illegal offline should be regulated online. Existing offences in law should be written into the Bill and it should be demonstrated how the regulator will set the thresholds for enforcement of those measures online.

This approach has been made possible because of the work of the Law Commission in producing its recommendations, particularly in introducing new offences around actively promoting self-harm and promoting content and information that is known to be false. A new measure will give us the mechanism to deal with malicious deepfake films being targeted at people. There are also necessary measures to make sure that there are guiding principles that the regulator has to work to, and the companies have to work to, to ensure regard to public health in dealing with dangerous disinformation relating to the pandemic or other public health issues.

We also have to ensure an obligation for the regulator to uphold principles of freedom of expression. It is important that effective action should be taken against hate speech, extremism, illegal content and all harmful content that is within the scope of the Bill, but if companies are removing content that has every right to be there—where the positive expression of people’s opinions has every right to be online—then the regulator should have the power to intervene in that direction as well.

At the heart of the regime has to be a system where Ofcom, as the independent regulator, can set mandatory codes and standards that we expect the companies to meet, and then use its powers to investigate and audit them to make sure that they are complying. We cannot have a system that is based on self-declared transparency reports by the companies where even they themselves struggle to explain what the results mean and there is no mechanism for understanding whether they are giving us the full picture or only a highly partial one. The regulator must have that power. Crucially, the codes of practice should set the mandatory minimum standards. We should not have Silicon Valley deciding what the online safety of citizens in this country should be. That should be determined through legislation passed through this Parliament empowering the regulator to set the minimum standards and take enforcement action when they have not been met.

We also believe that the Bill would be improved by removing a controversial area, the principles in clause 11. The priority areas of harm are determined by the Secretary of State and advisory to the companies. If we base the regulatory regime and the codes of practice on established offences that this Parliament has already created, which are known and understood and therefore enforced, we can say they are mandatory and clear and that there has been a parliamentary approval process in creating the offences in the first place.

Where new areas of harm are added to the schedules and the codes of practice, there should be an affirmative procedure in both Houses of Parliament to approve those changes to the code, so that Members have the chance to vote on changes to the codes of practice and the introduction of new offences as a consequence of those offences being created.

The Committee took a lot of evidence on the question of online fraud and scams. We received evidence from the Work and Pensions Committee and the Treasury Committee advising us that this should be done: if a known scam or attempt to rip off and defraud people is present on a website or social media platform, be it through advertising or any kind of posting, it should be within scope, and it should be for the regulator to require its removal. There should not be a general blanket exemption for advertising, which would create a perverse incentive to promote such content more actively.

Kevin Hollinrake Portrait Kevin Hollinrake (Thirsk and Malton) (Con)
- View Speech - Hansard - - - Excerpts

I thank my hon. Friend for his work on this important issue. Does he agree, as referred to in the report, that platforms must be required to proactively seek out that content and ensure it is changed, and if not, remove it, rather than all removals being prompted by users?

Damian Collins Portrait Damian Collins
- View Speech - Hansard - -

It is vital that companies are made to act proactively. That is one of the problems with the current regime, where action against illegal content is only required once it is reported to the companies and they are not proactively identifying it. My hon. Friend is right about that, particularly with frauds and scams where the perpetrators are known. The role of the regulator is to ensure that companies do not run those ads. The advertising authorities can still take action against individual advertisers, as can the police, but there should be a proactive responsibility on the platforms themselves.

If you will allow me to say one or two more things, Madam Deputy Speaker, we believe it is important that there should be user redress through the system. That is why the Committee recommended creating an ombudsman for cases where complaints have been exhausted without successful resolution, while also permitting civil redress through the courts.

If an individual or their family has been greatly harmed as a consequence of what they have seen on social media, they may take some solace in the fact that the regulator has intervened against the company for its failures and levied fines or taken action against individual directors. However, as an individual can take a case to the courts for a company’s failure to meet its obligations under data protection law, that should also apply to online safety legislation. An individual should have the right, on their own or with others, to sue a company for failing to meet its obligations under an online safety Act.

I commend the report to the House and thank everyone involved in its production for their hard work. This is a Bill we desperately need, and I look forward to seeing it pass through the House in this Session.

Several hon. Members rose—
- Hansard -

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - -

I thank all Members who contributed to what has been an excellent debate. We have heard from Members from each nation of the United Kingdom and from almost every political party represented in the House, all of whom supported the principle of the Bill and a majority of the recommendations in the report. I think we all share a sense of urgency to get this done.

Members spoke not just out of an appreciation of the policy issues, but from personal experience. The right hon. Member for Barking (Dame Margaret Hodge) talked about the abuse that she has received, as so many other Members of the House have. The hon. Member for Reading East (Matt Rodda) raised a case on behalf of his constituents, and my hon. Friend the Member for Bosworth (Dr Evans) did so with regard to his campaign on body image. We know ourselves, from our personal experience and the experience of our constituents, why legislation in this area is necessary. There is also a question about how the House scrutinises the powers we will give Ofcom and how the regime will work in the future.

Question put and agreed to.

Resolved,

That this House has considered the Report of the Joint Committee on the draft Online Safety Bill, HC 609.

Independent Fan-led Review of Football Governance

Damian Collins Excerpts
Thursday 25th November 2021

Commons Chamber

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- View Speech - Hansard - -

I welcome the excellent report that my hon. Friend the Member for Chatham and Aylesford (Tracey Crouch) has prepared and I welcome as well the fact that the Minister has said that, in principle, the Government accept the creation of the independent regulator. Obviously, it is vital that the independent regulator, when created, has the powers that it needs to do the job. Can he confirm that, in principle, the Government accept that that must include real-time access to financial information about the clubs if we are to prevent more club failures?

Nigel Huddleston Portrait Nigel Huddleston
- View Speech - Hansard - - - Excerpts

I thank my hon. Friend as well for his commitment to, interest in and insight into football and, indeed, sport in general over many years, and I appreciate what he is saying. Yes, I can confirm that we could not have an effective regulator without adequate powers, and the elements that he has raised will, of course, be part of that package. When I say that we accept the recommendation in principle, I mean that we are considering moving forward with legislation that covers not only the regulator itself, but the powers that the regulator may have.

Oral Answers to Questions

Damian Collins Excerpts
Thursday 18th November 2021

Commons Chamber
Nadine Dorries Portrait Ms Dorries
- Hansard - - - Excerpts

I take this opportunity to thank the Digital, Culture, Media and Sport Committee for the work that it has undertaken, particularly in gathering the evidence from Frances Haugen and others. We have taken a huge body of evidence. The Joint Committee is doing that very work at the moment. I am confident that every one of the examples that the hon. Gentleman has just highlighted will be legislated for in the regulatory framework that will be given to Ofcom to regulate those online platforms once the Bill becomes law. I appreciate his interest, and I would also appreciate his input as the Bill passes through the House.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- View Speech - Hansard - -

Does the Secretary of State agree that the key principle of the online safety Bill should be that offences that exist offline should be applied online—not only to those who post content with the intent of harming others, but to the platforms that host such content—and that we need to have ongoing close parliamentary scrutiny of which offences should apply and how?

Nadine Dorries Portrait Ms Dorries
- View Speech - Hansard - - - Excerpts

This is a novel and groundbreaking Bill that will legislate in a way that has never been done before, in a new sector and a new environment. Ongoing scrutiny on a regular basis once the Bill becomes an Act will be extremely important. We will look at how we are going to manage that within the Bill.