--- Later in debate ---
Mr Speaker (Sir Lindsay Hoyle)

I welcome the new Minister to the Dispatch Box.

Damian Collins

Thank you, Mr Speaker. I am honoured to have been appointed the Minister responsible for the Online Safety Bill. Having worked on these issues for a number of years, I am well aware of the urgency and importance of this legislation, in particular to protect children and tackle criminal activity online—that is why we are discussing this legislation.

Relative to the point of order from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), I have the greatest respect for him and his standing in this House, but it feels like we have been discussing this Bill for at least five years. We have had a Green Paper and a White Paper. We had a pre-legislative scrutiny process, which I was honoured to be asked to chair. We have had reports from the Digital, Culture, Media and Sport Committee and from other Select Committees and all-party parliamentary groups of this House. This legislation does not want for scrutiny.

We have also had a highly collaborative and iterative process in the discussion of the Bill. We have had 66 Government acceptances of recommendations made by the Joint Committee on the draft Online Safety Bill. We have had Government amendments in Committee. We are discussing Government amendments today and we have Government commitments to table amendments in the House of Lords. The Bill has received a huge amount of consultation. It is highly important legislation, and the victims of online crime, online fraud, bullying and harassment want to see us get the Bill into the Lords and on the statute book as quickly as possible.

Sir Jeremy Wright (Kenilworth and Southam) (Con)

I warmly welcome my hon. Friend to his position. He will understand that those of us who have followed the Bill in some detail since its inception had some nervousness as to who might be standing at that Dispatch Box today, but we could not be more relieved that it is him. May I pick up on his point about the point of order from our right hon. Friend the Member for Haltemprice and Howden (Mr Davis)? Does he agree that an additional point to add to his list is that, unusually, this legislation has a remarkable amount of cross-party consensus behind its principles? That distinguishes it from some of the other legislation that perhaps we should not consider in these two weeks. I accept there is plenty of detail to be examined but, in principle, this Bill has a lot of support in this place.

Damian Collins

I completely agree with my right hon. and learned Friend. That is why the Bill passed Second Reading without a Division and the Joint Committee produced a unanimous report. I am happy for Members to cast me in the role of poacher turned gamekeeper on the Bill, but looking around the House, there are plenty of gamekeepers turned poachers here today who will ensure we have a lively debate.

Mr Speaker

And the other way, as well.

Damian Collins

Exactly. The concept at the heart of this legislation is simple. Tech companies, like those in every other sector, must take appropriate responsibility for the consequences of their business decisions. As they continue to offer their users the latest innovations that enrich our lives, they must consider safety as well as profit. They must treat their users fairly and ensure that the internet remains a place for robust debate. The Bill has benefited from input and scrutiny from right across the House. I pay tribute to my predecessor, my hon. Friend the Member for Croydon South (Chris Philp), who has worked tirelessly on the Bill, not least through 50 hours of Public Bill Committee, and the Bill is better for his input and work.

We have also listened to the work of other Members of the House, including my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the right hon. Member for Barking (Dame Margaret Hodge), my right hon. Friend the Member for Haltemprice and Howden and the Chair of the Select Committee, my hon. Friend the Member for Solihull (Julian Knight), who have all made important contributions to the discussion of the Bill.

We have also listened to those concerned about freedom of expression online. It is worth pausing on that, as there has been a lot of discussion about whether the Bill is censoring legal speech online and much understandable outrage from those who think it is. I asked the same questions when I chaired the Joint Committee on the Bill. This debate does not reflect the actual text of the Bill itself. The Bill does not require platforms to restrict legal speech—let us be absolutely clear about that. It does not give the Government, Ofcom or tech platforms the power to make something illegal online that is legal offline. In fact, if those concerned about the Bill studied it in detail, they would realise that the Bill protects freedom of speech. In particular, the Bill will temper the huge power over public discourse wielded by the big tech companies behind closed doors in California. They are unaccountable for the decisions they make on censoring free speech on a daily basis. Their decisions about what content is allowed will finally be subject to proper transparency requirements.

Dame Maria Miller (Basingstoke) (Con)

My hon. Friend did not have the joy of being on the Bill Committee, as I did with my hon. Friend the Member for Croydon South (Chris Philp), who was the Minister at that point. The point that my hon. Friend has just made about free speech is so important for women and girls who are not able to go online because of the violent abuse that they receive, and that has to be taken into account by those who seek to criticise the Bill. We have to make sure that people who currently feel silenced do not feel silenced in future and can participate online in the way that they should be able to do. My hon. Friend is making an excellent point and I welcome him to his position.

Damian Collins

My right hon. Friend is entirely right on that point. The structure of the Bill is very simple. There is a legal priority of harms, and things that are illegal offline will be regulated online at the level of the criminal threshold. There are protections for freedom of speech and there is proper transparency about harmful content, which I will come on to address.

Joanna Cherry (Edinburgh South West) (SNP)

Does the Minister agree that, in moderating content, category 1 service providers such as Twitter should be bound by the duties under our domestic law not to discriminate against anyone on the grounds of a protected characteristic? Will he take a look at the amendments I have brought forward today on that point, which I had the opportunity of discussing with his predecessor, who I think was sympathetic?

Damian Collins

The hon. and learned Lady makes a very important point. The legislation sets regulatory thresholds at the criminal law level based on existing offences in law. Many of the points she made are covered by existing public law offences, particularly with regard to discriminating against people based on their protected characteristics. As she well knows, the internet is a reserved matter, so the legal threshold is set at where UK law stands, but where law may differ in Scotland, the police authorities in Scotland can still take action against individuals in breach of the law.

Joanna Cherry

The difficulty is that Twitter claims it is not covered by the Equality Act 2010. I have seen legal correspondence to that effect. I am not talking about the criminal law here. I am talking about Twitter’s duty not to discriminate against women, for example, or those who hold gender critical beliefs in its moderation of content. That is the purpose of my amendment today—it would ensure that Twitter and other service providers providing a service in the United Kingdom abide by our domestic law. It is not really a reserved or devolved matter.

Damian Collins

The hon. and learned Lady is right. There are priority offences where the companies, regardless of their terms of service, have to meet their obligations. If something is illegal offline, it is illegal online as well. There are priority areas where the company must proactively look for that. There are also non-priority areas where the company should take action against anything that is an offence in law and meets the criminal threshold online. The job of the regulator is to hold them to account for that. They also have to be transparent in their terms of service as category 1 companies. If they have clear policies against discrimination, which they on the whole all do, they will have to set out what they would do, and the regulator can hold them to account to make sure they do what they say. The regulator cannot make them take down speech that is legal or below a criminal threshold, but they can hold them to account publicly for the decisions they make.

One of the most important aspects of this Bill with regard to the category 1 companies is transparency. At the moment, the platforms make decisions about curating their content—who to take down, who to suppress, who to leave up—but those are their decisions. There is no external scrutiny of what they do or even whether they do what they say they will do. As a point of basic consumer protection law, if companies say in their terms of service that they will do something, they should be held to account for it. What is put on the label also needs to be in the tin and that is what the Bill will do for the internet.

I now want to talk about journalism and the role of the news media in the online world, which is a very important part of this Bill. The Government are committed to defending the invaluable role of a free media. Online safety legislation must protect the vital role of the press in providing people with reliable and accurate sources of information. Companies must therefore put in place protections for journalistic content. User-to-user services will not have to apply their safety duties in part 3 of the Bill to news publishers’ content shared on their services. News publishers’ content on their own sites will also not be in scope of regulation.

--- Later in debate ---
Dame Margaret Hodge (Barking) (Lab)

I welcome the Minister to his position, and it is wonderful to have somebody else who—like the previous Minister, the hon. Member for Croydon South (Chris Philp)—knows what he is talking about. On this issue, which is pretty key, I think it would work if minimum standards were set on the risk assessments that platforms have to make to judge what is legal but harmful content, but at the moment such minimum standards are not in the Bill. Could the Minister comment on that? Otherwise, there is a danger that platforms will set a risk assessment that allows really vile harmful but legal content to carry on appearing on their platform.

Damian Collins

The right hon. Lady makes a very important point. There have to be minimum safety standards, and I think that was also reflected in the report of the Joint Committee, which I chaired. Those minimum legal standards are set where the criminal law is set for these priority legal offences. A company may have higher terms of service—it may operate at a higher level—in which case it will be judged on the operation of its terms of service. However, for priority illegal content, it cannot have a code of practice that is below the legal threshold, and it would be in breach of the provisions if it did. For priority illegal offences, the minimum threshold is set by the law.

Dame Margaret Hodge

I understand that in relation to illegal harmful content, but I am talking about legal but harmful content. I understand that the Joint Committee that the hon. Member chaired recommended that for legal but harmful content, there should be minimum standards against which the platforms would be judged. I may have missed it, but I cannot see that in the Bill.

Damian Collins

The Joint Committee’s recommendation was for a restructuring of the Bill, so that rather than having general duty of care responsibilities that were not defined, we defined those responsibilities based on existing areas of law. The core principle behind the Bill is to take things that are illegal offline, and to regulate such things online based on the legal threshold. That is what the Bill does.

In schedule 7, which did not exist in the draft phase, we have written into the Bill a long list of offences in law. I expect that, as this regime is created, the House will insert more regulations and laws into schedule 7 as priority offences in law. Even if an offence in law is not listed in the priority illegal harms schedule, it can still be a non-priority harm, meaning that even if a company does not have to look for evidence of that offence proactively, it still has to act if it is made aware of the offence. I think the law gives us a very wide range of offences, clearly defined against offences in law, where there are clearly understood legal thresholds.

The question is: what is to be done about other content that may be harmful but sits below the threshold? The Government have made it clear that we intend to bring forward amendments that set out clear priorities for companies on the reporting of such harmful content, where we expect the companies to set out what their policies are. That will include setting out clearly their policies on things such as online abuse and harassment, the circulation of real or manufactured intimate images, content promoting self-harm, content promoting eating disorders or legal suicide content—this is content relating to adults—so the companies will have to be transparent on that point.

Chris Philp (Croydon South) (Con)

I congratulate the Minister on his appointment, and I look forward to supporting him in his role as he previously supported me in mine. I think he made an important point a minute ago about content that is legal but considered to be harmful. It has been widely misreported in the press that this Bill censors or prohibits such content. As the Minister said a moment ago, it does no such thing. There is no requirement on platforms to censor or remove content that is legal, and amendment 71 to clause 13 makes that expressly clear. Does he agree that reports suggesting that the Bill mandates censorship of legal content are completely inaccurate?

Damian Collins

I am grateful to my hon. Friend, and as I said earlier, he is absolutely right. There is no requirement for platforms to take down legal speech, and they cannot be directed to do so. What we have is a transparency requirement to set out their policies, with particular regard to some of the offences I mentioned earlier, and a wide schedule of things that are offences in law that are enforced through the Bill itself. This is a very important distinction to make. I said to him on Second Reading that I thought the general term “legal but harmful” had added a lot of confusion to the way the Bill was perceived, because it created the impression that the removal of legal speech could be required by order of the regulator, and that is not the case.

Debbie Abrahams (Oldham East and Saddleworth) (Lab)

I congratulate the Minister on his promotion and on his excellent chairmanship of the prelegislative scrutiny Committee, which I also served on. Is he satisfied with the Bill in relation to disinformation? It was concerning that there was only one clause on disinformation, and we know the impact—particularly the democratic impact—that that has on our society at large. Is he satisfied that the Bill will address that?

Damian Collins

It was a pleasure to serve alongside the hon. Lady on the Joint Committee. There are clear new offences relating to knowingly false information that will cause harm. As she will know, that was a Law Commission recommendation; it was not in the draft Bill but it is now in the Bill. The Government have also said that as a consequence of the new National Security Bill, which is going through Parliament, we will bring in a new priority offence relating to disinformation spread by hostile foreign states. As she knows, one of the most common areas for organised disinformation has been at state level. As a consequence of the new national security legislation, that will also be reflected in schedule 7 of this Bill, and that is a welcome change.

The Bill requires all services to take robust action to tackle the spread of illegal content and activity. Providers must proactively reduce the risk on their services of illegal activity and the sharing of illegal content, and they must identify and remove illegal content once it appears on their services. That is a proactive responsibility. We have tabled several interrelated amendments to reinforce the principle that companies must take a safety-by-design approach to managing the risk of illegal content and activity on their services. These amendments require platforms to assess the risk of their services being used to commit, or to facilitate the commission of, a priority offence and then to design and operate their services to mitigate that risk. This will ensure that companies put in place preventive measures to mitigate a broad spectrum of factors that enable illegal activity, rather than focusing solely on the removal of illegal content once it appears.

Henry Smith (Crawley) (Con)

I congratulate my hon. Friend on his appointment to his position. On harmful content, there are all too many appalling examples of animal abuse on the internet. What are the Government’s thoughts on how we can mitigate such harmful content, which is facilitating wildlife crime? Might similar online protections be provided for animals to the ones that clause 53 sets out for children?

Damian Collins

My hon. Friend raises an important point that deserves further consideration as the Bill progresses through its parliamentary stages. There is, of course, still a general presumption that any illegal activity that could also constitute illegal activity online—for example, promoting or sharing content that could incite people to commit violent acts—is within scope of the legislation. There are some priority illegal offences, which are set out in schedule 7, but the non-priority offences also apply if a company is made aware of content that is likely to be in breach of the law. I certainly think this is worth considering in that context.

In addition, the Bill makes it clear that platforms have duties to mitigate the risk of their service facilitating an offence, including where that offence may occur on another site, such as can occur in cross-platform child sexual exploitation and abuse—CSEA—offending, or even offline. This addresses concerns raised by a wide coalition of children’s charities that the Bill did not adequately tackle activities such as breadcrumbing—an issue my hon. Friend the Member for Solihull (Julian Knight), the Chair of the Select Committee, has raised in the House before—where CSEA offenders post content on one platform that leads to offences taking place on a different platform.

We have also tabled new clause 14 and a related series of amendments in order to provide greater clarity about how in-scope services should determine whether they have duties with regard to content on their services. The new regulatory framework requires service providers to put in place effective and proportionate systems and processes to improve user safety while upholding free expression and privacy online. The systems and processes that companies implement will be tailored to the specific risk profile of the service. However, in many cases the effectiveness of companies’ safety measures will depend on them making reasonable judgments about types of content. Therefore, it is essential to the effective functioning of the framework that there is clarity about how providers should approach these judgments. In particular, such clarity will safeguard against companies over-removing innocuous content if they wrongly assume mental elements are present, or under-removing content if they act only where all elements of an offence are established beyond reasonable doubt. The amendments make clear that companies must consider all reasonably available contextual information when determining whether content is illegal content, a fraudulent advert, content that is harmful to children, or content that is harmful to adults.

Kirsty Blackman (Aberdeen North) (SNP)

I was on the Bill Committee and we discussed lots of things, but new clause 14 was not discussed: we did not have conversations about it, and external organisations have not been consulted on it. Is the Minister not concerned that this is a major change to the Bill and it has not been adequately consulted on?

Damian Collins

As I said earlier, in establishing the threshold for priority illegal offences, the current threshold of laws that exist offline should provide good guidance. I would expect that as the codes of practice are developed, we will be able to make clear what those offences are. On the racial hatred that the England footballers received after the European championship football final, people have been prosecuted for what they posted on Twitter and other social media platforms. We know what race hate looks like in that context, we know what the regulatory threshold should look at and we know the sort of content we are trying to regulate. I expect that, in the codes of practice, Ofcom can be very clear with companies about what we expect, where the thresholds are and where we expect them to take enforcement action.

Dame Caroline Dinenage (Gosport) (Con)

I congratulate my hon. Friend on taking his new position; we rarely have a new Minister so capable of hitting the ground running. He makes a crucial point about clearness and transparency for both users and the social media providers and other platforms, because it is important that we make sure they are 100% clear about what is expected of them and the penalties for not fulfilling their commitments. Does he agree that opaqueness—a veil of secrecy—has been one of the obstacles, and that a whole raft of content has been taken down for the wrong reasons while other content has been left to proliferate because of the lack of clarity?

Damian Collins

That is entirely right, and in closing I say that the Bill does what we have always asked for it to do: it gives absolute clarity that illegal things offline must be illegal online as well, and be regulated online. It establishes clear responsibilities and liabilities for the platforms to do that proactively. It enables a regulator to hold the platforms to account on their ability to tackle those priority illegal harms and provide transparency on other areas of harmful content. At present we simply do not know about the policy decisions that companies choose to make: we have no say in it; it is not transparent; we do not know whether they do it. The Bill will deliver in those important regards. If we are serious about tackling issues such as fraud and abuse online, and other criminal offences, we require a regulatory system to do that and proper legal accountability and liability for the companies. That is what the Bill and the further amendments deliver.

Alex Davies-Jones (Pontypridd) (Lab)

It is an honour to respond on the first group of amendments on behalf of the Opposition.

For those of us who have been working on this Bill for some time now, it has been extremely frustrating to see the Government take such a siloed approach in navigating this complex legislation. I remind colleagues that in Committee Labour tabled a number of hugely important amendments that sought to make the online space safer for us all, but the Government responded by voting against each and every one of them. I certainly hope the new Minister—I very much welcome him to his post—has a more open-minded approach than his predecessor and indeed the Secretary of State; I look forward to what I hope will be a more collaborative approach to getting this legislation right.

With that in mind, it must be said that time and again this Government claim that the legislation is world-leading, but that is far from the truth. Instead, once again the Government have proposed hugely significant and contentious amendments only after line-by-line scrutiny in Committee; it is not the first time this has happened in this Parliament, and it is extremely frustrating for those of us who have debated this Bill for more than 50 hours over the past month.

I will begin by touching on Labour’s broader concerns around the Bill. As the Minister will be aware, we believe that the Government have made a fundamental mistake in their approach to categorisation, which undermines the very structure of the Bill. We are not alone in this view and have the backing of many advocacy and campaign groups including the Carnegie UK Trust, Hope Not Hate and the Antisemitism Policy Trust. Categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet.

We all know that smaller platforms such as 4chan and BitChute have significant numbers of users who are highly motivated to promote very dangerous content. Their aim is to promote radicalisation and to spread hate and harm.

Alex Davies-Jones

My hon. Friend is absolutely right, and has touched on elements that I will address later in my speech. I will look at cross-platform harm and breadcrumbing; the Government have taken action to address that issue, but they need to go further.

Damian Collins

I am sorry to intervene so early in the hon. Lady’s speech, and thank her for her kind words. I personally agree that the question of categorisation needs to be looked at again, and the Government have agreed to do so. We will hopefully discuss it next week during consideration of the third group of amendments.

Alex Davies-Jones

I welcome the Minister’s commitment, which is something that the previous Minister, the hon. Member for Croydon South (Chris Philp) also committed to in Committee. However, it should have been in the Bill to begin with, or been tabled as an amendment today so that we could discuss it on the Floor of the House. We should not have to wait until the Bill goes to the other place to discuss this fundamental, important point that I know colleagues on the Minister’s own Back Benches have been calling for. Here we are, weeks down the line, with nothing having been done to fix that problem, which we know will be a persistent problem unless action is taken. It is beyond frustrating that no indication was given in Committee of these changes, because they have wide-ranging consequences for the effects of the Bill. Clearly, the Government are distracted with other matters, but I remind the Minister that Labour has long called for a safer internet, and we are keen to get the Bill right.

Let us start with new clause 14, which provides clarification about how online services should determine whether content should be considered illegal, and therefore how the illegal safety duty should apply. The new clause is deeply problematic, and is likely to reduce significantly the amount of illegal content and fraudulent advertising that is correctly identified and acted on. First, companies will be expected to determine whether content is illegal or fraudulent based on information that is

“reasonably available to a provider”,

with reasonableness determined in part by the size and capacity of the provider. That entrenches the problems I have outlined with smaller, high-risk companies being subject to fewer duties despite the acute risks they pose. Having less onerous applications of the illegal safety duties will encourage malign actors to migrate illegal activity on to smaller sites that have less pronounced regulatory expectations placed on them. That has particularly concerning ramifications for children’s protections, which I will come on to shortly. On the other end of the scale, larger sites could use new clause 14 to argue that their size and capacity, and the corresponding volumes of material they are moderating, makes it impractical for them reliably and consistently to identify illegal content.

The second problem arises from the fact that the platforms will need to have

“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”.

That significantly raises the threshold at which companies are likely to determine that content is illegal. In practice, companies have routinely failed to remove content where there is clear evidence of illegal intent. That has been the case in instances of child abuse breadcrumbing, where platforms use their own definitions of what constitutes a child abuse image for moderation purposes. Charities believe it is inevitable that companies will look to use this clause to minimise their regulatory obligations to act.

Finally, new clause 14 and its resulting amendments do not appear to be adequately future-proofed. The new clause sets out that judgments should be made

“on the basis of all relevant information that is reasonably available to a provider.”

However, on Meta’s first metaverse device, the Oculus Quest product, that company records only two minutes of footage on a rolling basis. That makes it virtually impossible to detect evidence of grooming, and companies can therefore argue that they cannot detect illegal content because the information is not reasonably available to them. The new clause undermines and weakens the safety mechanisms that the Minister, his team, the previous Minister, and all members of the Joint Committee and the Public Bill Committee have worked so hard to get right. I urge the Minister to reconsider these amendments and withdraw them.

I will now move on to improving the children’s protection measures in the Bill. In Committee, it was clear that one thing we all agreed on, cross-party and across the House, was trying to get the Bill to work for children. With colleagues in the Scottish National party, Labour Members tabled many amendments and new clauses in an attempt to achieve that goal. However, despite their having the backing of numerous children’s charities, including the National Society for the Prevention of Cruelty to Children, 5Rights, Save the Children, Barnardo’s, The Children’s Society and many more, the Government sadly did not accept them. We are grateful to those organisations for their insights and support throughout the Bill’s passage.

We know that children face significant risks online, from bullying and sexist trolling to the most extreme grooming and child abuse. Our amendments focus in particular on preventing grooming and child abuse, but before I speak to them, I associate myself with the amendments tabled by our colleagues in the Scottish National party, the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I associate myself with the sensible changes they have suggested to the Bill at this stage, including a change to children’s access assessments through amendment 162 and a strengthening of duties to prevent harm to children caused by habit-forming features through amendment 190.

Since the Bill was first promised in 2017, the number of online grooming crimes reported to the police has increased by more than 80%. Last year, around 120 offences of sexual communication with a child were committed every single week, and those are only the reported cases. The NSPCC has warned that that amounts to a

“tsunami of online child abuse”.

We now have the first ever opportunity to legislate for a safer world online for our children.

However, as currently drafted, the Bill falls short by failing to grasp the dynamics of online child abuse and grooming, which rarely occurs on one single platform or app, as mentioned by my hon. Friend the Member for Oldham East and Saddleworth (Debbie Abrahams). In well-established grooming pathways, abusers exploit the design features of open social networks to contact children, then move their communication across to other, more encrypted platforms, including livestreaming sites and encrypted messaging services. For instance, perpetrators manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with large numbers of children, who they then groom through direct messages before moving to encrypted services such as WhatsApp, where they coerce children into sending sexual images. That range of techniques is often referred to as child abuse breadcrumbing, and is a significant enabler of online child abuse.

I will give a sense of how easy it is for abusers to exploit children by recounting the words and experiences of a survivor, a 15-year-old girl who was groomed on multiple sites:

“I’ve been chatting with this guy online who’s…twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to ‘prove my trust’ to him, like doing video chats with my chest exposed. Every time I did these things for him, he would ask for more and I felt like it was too late to back out. This whole thing has been slowly destroying me and I’ve been having thoughts of hurting myself.”

I appreciate that it is difficult listening, but that experience is being shared by thousands of other children every year, and we need to be clear about the urgency that is needed to change that.

It will come as a relief to parents and children that, through amendments 58 to 61, the Government have finally agreed to close the loophole that allowed for breadcrumbing to continue. However, I still wish to speak to our amendments 15, 16, and 17 to 19, which were tabled before the Government changed their mind. Together with the Government’s amendments, these changes will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material.

Amendment 15 would ensure that platforms have to include in their illegal content risk assessment content that

“reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

Amendment 16 would ensure that platforms have to maintain proportionate systems and processes to minimise the presence of such content on their sites. The wording of our amendments is tighter and includes aiding the discovery or dissemination of content, whereas the Government’s amendments cover only “commission or facilitation”. Can the Minister tell me why the Government chose that specific wording and opposed the amendments that we tabled in Committee, which would have done the exact same thing? I hope that in the spirit of collaboration that we have fostered throughout the passage of the Bill with the new Minister and his predecessor, the Minister will consider the merit of our amendments 15 and 16.

Labour is extremely concerned about the significant powers that the Bill in its current form gives to the Secretary of State. We see that approach to the Bill as nothing short of a shameless attempt at power-grabbing from a Government whose so-called world-leading Bill is already failing in its most basic duty of keeping people safe online. Two interlinked issues arise from the myriad powers granted to the Secretary of State throughout the Bill: the first is the unjustified intrusion of the Secretary of State into decisions that are about the regulation of speech, and the second is the unnecessary levels of interference and threats to the independence of Ofcom that arise from the powers of direction to Ofcom in its day-to-day matters and operations. That is not good governance, and it is why Labour has tabled a range of important amendments that the Minister must carefully consider. None of us wants the Bill to place undue powers in the hands of only one individual. That is not a normal approach to regulation, so I fail to see why the Government have chosen to go down that route in this case.

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I think we would all agree that when we look at the priority harms set out in the Bill, women and girls are disproportionately the victims of those offences. The groups in society that the Bill will most help are women and girls in our community. I am happy to work with the hon. Lady and all hon. Members to look at what more we can do on this point, both during the passage of the Bill and in future, but as it stands the Bill is the biggest step forward in protecting women and girls, and all users online, that we have ever seen.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful to the Minister for the offer to work on that further, but we have an opportunity now to make real and lasting change. We talk about how we tackle this issue going forward. How can we solve the problem of violence against women and girls in our community? Three women a week are murdered at the hands of men in this country—that is shocking. How can we truly begin to tackle a culture change? This is how it starts. We have had enough of words. We have had enough of Ministers standing at the Dispatch Box saying, “This is how we are going to tackle violence against women and girls; this is our new plan to do it.” They have an opportunity to create a new law that makes it a priority harm, and that makes women and girls feel like they are being listened to, finally. I urge the Minister and Members in all parts of the House, who know that this is a chance for us finally to take that first step, to vote for new clause 3 today and make women and girls a priority by showing understanding that they receive a disproportionate level of abuse and harm online, and by making them a key component of the Bill.

David Davis Portrait Mr David Davis (Haltemprice and Howden) (Con)
- View Speech - Hansard - - - Excerpts

I join everybody else in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), to the Front Bench. He is astonishingly unusual in that he is both well-intentioned and well-informed, a combination we do not always find among Ministers.

I will speak to my amendments to the Bill. I am perfectly willing to be in a minority of one—one of my normal positions in this House. To be in a minority of one on the issue of free speech is an honourable place to be. I will start by saying that I think the Bill is fundamentally mis-designed. It should have been several Bills, not one. It is so complex that it is very difficult to forecast the consequences of what it sets out to do. It has the most fabulously virtuous aims, but unfortunately the way things will be done under it, with the use of Government organisations to make decisions that, properly, should be taken on the Floor of the House, is in my view misconceived.

We all want the internet to be safe. Right now, there are too many dangers online—we have been hearing about some of them from the hon. Member for Pontypridd (Alex Davies-Jones), who made a fabulous speech from the Opposition Front Bench—from videos propagating terror to posts promoting self-harm and suicide. But in its well-intentioned attempts to address those very real threats, the Bill could actually end up being the biggest accidental curtailment of free speech in modern history.

There are many reasons to be concerned about the Bill. Not all of them are to be dealt with in this part of the Report stage—some will be dealt with later—and I do not have time to mention them all. I will make one criticism of the handling of the Bill at this point. I have seen much smaller Bills have five days on Report in the past. This Bill demands more than two days. That was part of what I said in my point of order at the beginning.

One of the biggest problems is the “duties of care” that the Bill seeks to impose on social media firms to protect users from harmful content. That is a more subtle issue than the tabloid press have suggested. My hon. Friend the Member for Croydon South (Chris Philp), the previous Minister, made that point and I have some sympathy with him. I have spoken to representatives of many of the big social media firms, some of which cancelled me after speeches that I made at the Conservative party conference on vaccine passports. I was cancelled for 24 hours, which was an amusing process, and they put me back up as soon as they found out what they had done. Nevertheless, that demonstrated how delicate and sensitive this issue is. That was a clear suppression of free speech without any of the pressures that are addressed in the Bill.

When I spoke to the firms, they made it plain that they did not want the role of online policemen, and I sympathise with them, but that is what the Government are making them do. With the threat of huge fines and even prison sentences if they consistently fail to abide by any of the duties in the Bill—I am using words from the Bill—they will inevitably err on the side of censorship whenever they are in doubt. That is the side they will fall on.

Worryingly, the Bill targets not only illegal content, which we all want to tackle—indeed, some of the practice raised by the Opposition Front Bencher, the hon. Member for Pontypridd should simply be illegal full stop—but so-called “legal but harmful” content. Through clause 13, the Bill imposes duties on companies with respect to legal content that is “harmful to adults”. It is true that the Government have avoided using the phrase “legal but harmful” in the Bill, preferring “priority content”, but we should be clear about what that is.

The Bill’s factsheet, which is still on the Government’s website, states on page 1:

“The largest, highest-risk platforms will have to address named categories of legal but harmful material”.

This is not just a question of transparency—they will “have to” address that. It is simply unacceptable to target lawful speech in this way. The “Legal to Say, Legal to Type” campaign, led by Index on Censorship, sums up this point: it is both perverse and dangerous to allow speech in print but not online.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

As I said, a company may be asked to address this, which means that it has to set out what its policies are, how it would deal with that content and its terms of service. The Bill does not require a company to remove legal speech that it has no desire to remove. The regulator cannot insist on that, nor can the Government or the Bill. There is nothing to make legal speech online illegal.

David Davis Portrait Mr Davis
- Hansard - - - Excerpts

That is exactly what the Minister said earlier and, indeed, said to me yesterday when we spoke about this issue. I do not deny that, but this line of argument ignores the unintended consequences that the Bill may have. Its stated aim is to achieve reductions in online harm, not just illegal content. Page 106 of the Government’s impact assessment lists a reduction in the prevalence of legal but harmful content as a “key evaluation” question. The Bill aims to reduce that—the Government say that both in the online guide and the impact assessment. The impact assessment states that an increase in “content moderation” is expected because of the Bill.

A further concern is that the large service providers already have terms and conditions that address so-called legal but harmful content. A duty to state those clearly and enforce them consistently risks legitimising and strengthening the application of those terms and conditions, possibly through automated scanning and removal. That is precisely what happened to me before the Bill was even dreamed of. That was done under an automated system, backed up by somebody in Florida, Manila or somewhere who decided that they did not like what I said. We have to bear in mind how cautious the companies will be. That is especially worrying because, as I said, providers will be under significant pressure from outside organisations to include restrictive terms and conditions. I say this to Conservative Members, and we have some very well-intentioned and very well-informed Members on these Benches: beware of the gamesmanship that will go on in future years in relation to this.

Ofcom and the Department see these measures as transparency measures—that is the line. Lord Michael Grade, who is an old friend of mine, came to see me and he talked about this not as a pressure, but as a transparency measure. However, these are actually pressure measures. If people are made to announce things and talk about them publicly, that is what they become.

It is worth noting that several free speech and privacy groups have expressed scepticism about the provisions, yet they were not called to give oral evidence in Committee. A lot of other people were, including pressure groups on the other side and the tech companies, which we cannot ignore, but free speech advocates were not.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I did of course hear what was said by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis). To be honest, I think that increased scrutiny of content which might constitute abuse or harassment, whether of women or of ethnic minorities, is to be warmly welcomed. The Bill provides that the risk assessors must pay attention to the characteristics of the user. There is no cross-reference to the Equality Act—I know the hon. and learned Lady has submitted a request on that, to which my successor Minister will now be responding—but there are references to characteristics in the provisions on safety duties, and those characteristics do of course include gender and race.

In relation to the risk that these duties are over-interpreted or over-applied, for the first time ever there is a duty for social media firms to have regard to freedom of speech. At present these firms are under no obligation to have regard to it, but clause 19(2) imposes such a duty, and anyone who is concerned about free speech should welcome that. Clauses 15 and 16 go further: clause 15 creates special protections for “content of democratic importance”, while clause 16 does the same for content of journalistic importance. So while I hugely respect and admire my right hon. Friend the Member for Haltemprice and Howden, I do not agree with his analysis in this instance.

I would now like to ask a question of my successor. He may wish to refer to it later or write to me, but if he feels like intervening, I will of course give way to him. I note that four Government amendments have been tabled; I suppose I may have authorised them at some point. Amendments 72, 73, 78 and 82 delete some words in various clauses, for example clauses 13 and 15. They remove the words that refer to treating content “consistently”. The explanatory note attached to amendment 72 acknowledges that, and includes a reference to new clause 14, which defines how providers should go about assessing illegal content, what constitutes illegal content, and how content is to be determined as being in one of the various categories.

As far as I can see, new clause 14 makes no reference to treating, for example, legal but harmful content “consistently”. According to my quick reading—without the benefit of highly capable advice—amendments 72, 73, 78 and 82 remove the obligation to treat content “consistently”, and it is not reintroduced in new clause 14. I may have misread that, or misunderstood it, but I should be grateful if, by way of an intervention, a later speech or a letter, my hon. Friend the Minister could give me some clarification.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I think that the codes of practice establish what we expect the response of companies to be when dealing with priority illegal harm. We would expect the regulator to apply those methods consistently. If my hon. Friend fears that that is no longer the case, I shall be happy to meet him to discuss the matter.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clause 13(6)(b), for instance, states that the terms of service must be

“applied consistently in relation to content”,

and so forth. As far as I can see, amendment 72 removes the word “consistently”, and the explanatory note accompanying the amendment refers to new clause 14, saying that it does the work of the previous wording, but I cannot see any requirement to act consistently in new clause 14. Perhaps we could pick that up in correspondence later.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

If there is any area of doubt, I shall be happy to follow it up, but, as I said earlier, I think we would expect that if the regulator establishes through the codes of practice how a company will respond proactively to identify illegal priority content on its platform, it is inherent that that will be done consistently. We would accept the same approach as part of that process. As I have said, I shall be happy to meet my hon. Friend and discuss any gaps in the process that he thinks may exist, but that is what we expect the outcome to be.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am grateful to my hon. Friend for his comments. I merely observe that the “consistency” requirements were written into the Bill, and, as far as I can see, are not there now. Perhaps we could discuss it further in correspondence.

Let me turn briefly to clause 40 and the various amendments to it—amendments 44, 45, 13, 46 and others—and the remarks made by the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), about the Secretary of State’s powers. I intervened on the hon. Lady earlier on this subject. It also arose in Committee, when she and many others made important points on whether the powers in clause 40 went too far and whether they impinged reasonably on the independence of the regulator, in this case Ofcom. I welcome the commitments made in the written ministerial statement laid last Thursday—coincidentally shortly after my departure—that there will be amendments in the Lords to circumscribe the circumstances in which the Secretary of State can exercise those powers to exceptional circumstances. I heard the point made by the hon. Member for Ochil and South Perthshire that it was unclear what “exceptional” meant. The term has a relatively well defined meaning in law, but the commitment in the WMS goes further and says that the bases upon which the power can be exercised will be specified and limited to certain matters such as public health or matters concerning international relations. That will severely limit the circumstances in which those powers can be used, and I think it would be unreasonable to expect Ofcom, as a telecommunications regulator, to have expertise in those other areas that I have just mentioned. I think that the narrowing is reasonable, for the reasons that I have set out.

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I agree with my hon. Friend on both points. I discussed the point about researcher access with him last week, when our roles were reversed, so I am sympathetic to that. There is a difference between that and the researcher access that the Digital Services Act in Europe envisages, which will not have the legal powers that Ofcom will have to compel and demand access to information. It will be complementary but it will not replace the primary powers that Ofcom will have, which will really set our regime above those elsewhere. It is certainly my belief that the algorithmic amplification of harmful content must be addressed in the transparency reports and that, where it relates to illegal activities, it must absolutely be within the scope of the regulator to state that actively promoting illegal content to other people is an offence under this legislation.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

On my hon. Friend’s first point, he is right to remind the House that the obligations to disclose information to Ofcom are absolute; they are hard-edged and they carry criminal penalties. Researcher access in no way replaces that; it simply acts as a potential complement to it. On his second point about algorithmic promotion, of course any kind of content that is illegal is prohibited, whether algorithmically promoted or otherwise. The more interesting area relates to content that is legal but perceived as potentially harmful. We have accepted that the judgments on whether that content stays up or not are for the platforms to make. If they wish, they can choose to allow that content simply to stay up. However, it is slightly different when it comes to algorithmically promoting it, because the platform is taking a proactive decision to promote it. That may be an area that is worth thinking about a bit more.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

On that point, if a platform has a policy not to accept a certain sort of content, I think the regulators should expect it to say in its transparency report what it is doing to ensure that it is not actively promoting that content through a newsfeed, on Facebook or “next up” on YouTube. I expect that to be absolutely within the scope of the powers we have in place.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

In terms of content that is legal but potentially harmful, as the Bill is drafted, the platforms will have to set out their policies, but their policies can say whatever they like, as we discussed earlier. A policy could include actively promoting content that is harmful through algorithms, for commercial purposes. At the moment, the Bill as constructed gives them that freedom. I wonder whether that is an area that we can think about making slightly more prescriptive. Giving them the option to leave the content up there relates to the free speech point, and I accept that, but choosing to algorithmically promote it is slightly different. At the moment, they have the freedom to choose to algorithmically promote content that is toxic but falls just on the right side of legality. If they want to do that, that freedom is there, and I just wonder whether it should be. It is a difficult and complicated topic and we are not going to make progress on it today, but it might be worth giving it a little more thought.

I think I have probably spoken for long enough on this Bill, not just today but over the last few months. I broadly welcome these amendments but I am sure that, as the Bill completes its stages, in the other place as well, there will be opportunities to slightly fine-tune it that all of us can make a contribution to.

--- Later in debate ---
Joanna Cherry Portrait Joanna Cherry
- View Speech - Hansard - - - Excerpts

That is why I am giving the Bill a cautious welcome, but I still stand by my very legitimate concerns about the chilling effect of aspects of this Bill. I will give some examples in a moment about the problems that have arisen when organisations such as Twitter are left to their own devices on their moderation of content policy.

As all hon. Members will be aware, under the Equality Act there are a number of protected characteristics. These include: age; gender reassignment; being married or in a civil partnership; being pregnant or on maternity leave; disability; race, including colour, nationality, ethnic or national origin; religion or belief; sex; and sexual orientation. It is against the law to discriminate, victimise or harass anyone because of any of those protected characteristics, but Twitter does discriminate against some of the protected characteristics. It often discriminates against women in the way that I described in an intervention earlier. It takes down expressions of feminist belief, but refuses to take down expressions of the utmost violent intent against women. It also discriminates against women who hold gender-critical beliefs. I remind hon. Members that, in terms of the Employment Appeal Tribunal’s decision in the case of Maya Forstater, the belief that sex matters is worthy of respect in a democratic society and, under the Equality Act, people cannot lawfully discriminate against women, or indeed men, who hold those views.

Twitter also sometimes discriminates against lesbians, gay men and bisexual people who assert that their sexual orientation is on the basis of sex, not gender, despite the fact that same-sex orientation, such as I hold, is a protected characteristic under the Equality Act.

At present, Twitter claims not to be covered by the Equality Act. I have seen correspondence from its lawyers that sets out the purported basis for that claim, partly under reference to schedule 25 to the Equality Act, and partly because it says:

“Twitter UK is included in an Irish Company and is incorporated in the Republic of Ireland. It does pursue economic activity through a fixed establishment in the UK but that relates to income through sales and marketing with the main activity being routed through Ireland.”

I very much doubt whether that would stand up in court, since Twitter is clearly providing a service in the United Kingdom, but it would be good if we took the opportunity of this Bill to clarify that the Equality Act applies to Twitter, so that when it applies moderation of content under the Bill, it will not discriminate against any of the protected characteristics.

The Joint Committee on Human Rights, of which I am currently the acting Chair, looked at this three years ago. We had a Twitter executive before our Committee and I questioned her at length about some of the content that Twitter was content to support in relation to violent threats against women and girls and, on the other hand, some of the content that Twitter took down because it did not like the expression of certain beliefs by feminists or lesbians.

We discovered on the Joint Committee on Human Rights that Twitter’s hateful conduct policy does not include sex as a protected characteristic. It does not reflect the domestic law of the United Kingdom in relation to anti-discrimination law. Back in October 2019, in the Committee’s report on democracy, freedom of expression and freedom of association, we recommended that Twitter should include sex as a protected characteristic in its hateful conduct policy, but Twitter has not done that. It seems Twitter thinks it is above the domestic law of the United Kingdom when it comes to anti-discrimination.

At that Committee, the Twitter executive assured me that certain violent memes that often appear on Twitter directed against women such as me and against many feminists in the United Kingdom, threatening us with death by shooting, should be removed. However, just in the past 48 hours I have seen an example of Twitter’s refusing to remove that meme. Colleagues should be assured that there is a problem here, and I would like us to direct our minds to it, as the Bill gives us an opportunity to do.

Whether or not Twitter is correctly praying in aid the loophole it says there is in the Equality Act—I think that is questionable—the Bill gives us the perfect opportunity to clarify matters. Clause 3 clearly brings Twitter and other online service providers within the regulatory scheme of the Bill as a service with

“a significant number of United Kingdom users”.

The Bill squarely recognises that Twitter provides a service in the United Kingdom to UK users, so it is only a very small step to amend the Bill to make it absolutely clear that when it does so it should be subject to the Equality Act. That is what my new clause 24 seeks to do.

I have also tabled new clauses 193 and 191 to ensure that Twitter and other online platforms obey non-discrimination law regarding Ofcom’s production of codes of practice and guidance. The purpose of those new clauses is to ensure that Ofcom consults with persons who have expertise in the Equality Act before producing those codes of conduct.

I will not push the new clauses to a vote. I had a very productive meeting with the Minister’s predecessor, the hon. Member for Croydon South (Chris Philp), who expressed a great deal of sympathy when I explained the position to him. I have been encouraged by the cross-party support for the new clauses, both in discussions before today with Members from all parties and in some of the comments made by various hon. Members today.

I am really hoping that the Government will take my new clauses away and give them very serious consideration, that they will look at the Joint Committee’s report from October 2019 and that either they will adopt these amendments or perhaps somebody else will take them forward in the other place.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I can assure the hon. and learned Lady that I am happy to carry on the dialogue that she had with my predecessor and meet her to discuss this at a future date.

Joanna Cherry Portrait Joanna Cherry
- Hansard - - - Excerpts

I am delighted to hear that. I must tell the Minister that I have had a huge number of approaches from women, from lesbians and from gay men across the United Kingdom who are suffering as a result of Twitter’s moderation policy. There is a lot of support for new clause 24.

Of course, it is important to remember that the Equality Act protects everyone. Gender reassignment is there with the protected characteristics of sex and sexual orientation. It is really not acceptable for a company such as Twitter, which provides a service in the United Kingdom, to seek to flout and ignore the provisions of our domestic law on anti-discrimination. I am grateful to the Minister for the interest he has shown and for his undertaking to meet me, and I will leave it at that for now.

--- Later in debate ---
It is a modest proposal for the Bill, but it could have a major impact on the industry out there at the moment, which for many years has been completely unregulated. I do not propose pressing my new clause to a vote, but will the Minister work with his Department of Health and Social Care colleagues? Following the Health and Care Act 2022, there is a consultation on the regulations, and we could make a real difference for those I am worried about and concerned for—the more and more young people who are being bombarded with these adverts. In some cases, dangerous and potentially life-threatening procedures are being sold to them as if they are just like any other service, and they are not.
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The right hon. Gentleman makes a very important point and, as he knows, there is a wider ongoing Government review related to advertising online, which is a very serious issue. I assure him that we will follow up with colleagues in the Department of Health and Social Care to discuss the points he has raised.

Lord Beamish Portrait Mr Jones
- Hansard - - - Excerpts

I am grateful to the Minister and I will be keeping a beady eye to see how far things go. The proposal would make a difference. It is a simple but effective way of protecting people, especially young people.

--- Later in debate ---
Feryal Clark Portrait Feryal Clark (Enfield North) (Lab)
- View Speech - Hansard - - - Excerpts

I join everyone else in the House in welcoming the Minister to his place.

I rise to speak in support of amendments 15 and 16. At the core of this issue is the first duty of any Government: to keep people safe. Too often in debates, which can become highly technical, we lose sight of that fact. We are not just talking about technology and regulation; we are talking about real lives and real people. It is therefore incumbent on all of us in this place to have that at the forefront of our minds when discussing such legislation.

Labelling social media as the wild west of today is hardly controversial—that is plain and obvious for all to see. There has been a total failure on the part of social media companies to make their platforms safe for everyone to use, and that needs to change. Regulation is not a dirty word, but a crucial part of ensuring that as the internet plays a bigger role in every generation’s lives, it meets the key duty of keeping people safe. It has been a decade since we first heard of this Bill, and almost four years since the Government committed to it, so I am afraid that there is nothing even slightly groundbreaking about the Bill as it is today. We have seen progress being made in this area around the world, and the UK is falling further and further behind.

Of particular concern to me is the impact on children and young people. As a mother, I worry for the world that my young daughter will grow up in, and I will do all I can in this place to ensure that children’s welfare is at the absolute forefront. I can see no other system or institution that children are allowed to engage with that has such a glaring lack of safeguards and regulation. If there was a faulty slide in a playground, it would be closed off and fixed. If a sports field was covered with glass or litter, that would be reported and dealt with. Whether we like it or not, social media has become the streets our children hang out in, the world they grow up in and the playground they use. It is about time we started treating it with the same care and attention.

There are far too many holes in the Bill that allow for the continued exploitation of children. Labour’s amendments 15 and 16 tackle the deeply troubling issue of “breadcrumbing”. That is where child abusers use social networks to lay trails to illegal content elsewhere online and share videos of abuse edited to fall within content moderation guidelines. The amendments would give the regulators powers to tackle that disgusting practice and ensure that there is a proactive response to it. They would bring into regulatory scope the millions of interactions with accounts that actively enable child abuse. Perhaps most importantly, they would ensure that social media companies tackled child abuse at the earliest possible stage.

In its current form, even with Government amendment 14, the Bill merely reinforces companies’ current focus only on material that explicitly reaches the criminal threshold. That is simply not good enough. Rather than acknowledging that issue, Government amendments 71 and 72 let social media companies off the hook. They remove the requirement for companies to apply their terms and conditions “consistently”. That was addressed very eloquently by the hon. Member for Croydon South (Chris Philp) and the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), who highlighted that Government amendment 14 simply does not go far enough.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

On the amendments that the former Minister, my hon. Friend the Member for Croydon South (Chris Philp), spoke to, the word “consistently” has not been removed from the text. There is new language that follows the use of “consistently”, but the use of that word will still apply in the context of the companies’ duties to act against illegal content.

Feryal Clark Portrait Feryal Clark
- View Speech - Hansard - - - Excerpts

I welcome the Minister’s clarification and look forward to the amendments being made to the Bill. As they stand, however, the proposals achieve little beyond tying one of our hands behind our back in trying to keep children safe. They will undermine the entire regulatory system, rendering it practically ineffective.

Although I welcome the Bill and some of the Government amendments, it still lacks a focus on ensuring that tech companies have the proper systems in place to fulfil their duty of care and keep our children safe. The children of this country deserve better. That is why I wholeheartedly welcome the amendments tabled by my hon. Friend the Member for Pontypridd (Alex Davies-Jones) and urge Government Members to support them.

--- Later in debate ---
Munira Wilson Portrait Munira Wilson (Twickenham) (LD)
- View Speech - Hansard - - - Excerpts

I rise to speak to new clauses 25 and 26 in my name. The Government rightly seek to make the UK the safest place in the world to go online, especially for our children, and some of their amendments will start to address previous gaps in the Bill. However, I believe that the Bill still falls short in its aim not only to protect children from harm and abuse, but, importantly, to empower and enable young people to make the most of the online world.

I welcome the comments that the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) made about how we achieve the balance between rights and protecting children from harm. I also welcome his amendments on children’s wellbeing, which seek to achieve that balance.

With one in five children going online, keeping them safe is more difficult but more important than ever. I speak not only as the mother of two very young children who are growing up with iPads in their hands, but as—like everyone else in the Chamber—a constituency Member of Parliament who speaks regularly to school staff and parents who are concerned about the harms caused by social media in particular, but also those caused by games and other services to which children have access.

The Bill proffers a broad and vague definition of content that is legal yet harmful. As many have already said, it should not be the responsibility of the Secretary of State, in secondary legislation, to make decisions about how and where to draw the line; Parliament should set clear laws that address specific, well-defined harms, based on strong evidence. The clear difficulty that the Government have in defining what content is harmful could have been eased had the Bill focused less on removing harmful content and more on why service providers allow harmful content to spread so quickly and widely. Last year, the 5Rights Foundation conducted an experiment in which it created several fake Instagram profiles for children aged between 14 and 17. When the accounts were searched for the term “skinny”, while a warning pop-up message appeared, among the top results were

“accounts promoting eating disorders and diets, as well as pages advertising appetite-suppressant gummy bears.”

Ultimately, the business models of these services profit from the spread of such content. New clause 26 requires the Government and Ofcom to focus on ensuring that internet services are safe by design. They should not be using algorithms that give prominence to harmful content. The Bill should focus on harmful systems rather than on harmful content.

Damian Collins Portrait Damian Collins
- View Speech - Hansard - - - Excerpts

It does focus on systems as well as content. We often talk about content because it is the exemplar for the failure of the systems, but the systems are entirely within the scope of the Bill.

Munira Wilson Portrait Munira Wilson
- View Speech - Hansard - - - Excerpts

I thank the Minister for that clarification, but there are still many organisations out there, not least the Children’s Charities Coalition, that feel that the Bill does not go far enough on safety by design. Concerns have rightly been expressed about freedom of expression, but if we focus on design rather than content, we can protect freedom of expression while keeping children safe at the same time. New clause 26 is about tackling harms downstream, safeguarding our freedoms and, crucially, expanding participation among children and young people. I fear that we will always be on the back foot when trying to tackle harmful content. I fear that regulators or service providers will become over-zealous in taking down what they consider to be harmful content, removing legal content from their platforms just in case it is harmful, or introducing age gates that deny children access to services outright.

Of course, some internet services are clearly inappropriate for children, and illegal content should be removed—I think we all agree on that—but let us not lock children out of the digital world or let their voices be silenced. Forty-three per cent. of girls hold back their opinions on social media for fear of criticism. Children need a way to exercise their rights. Even the Children’s Commissioner for England has said that heavy-handed parental controls that lock children out of the digital world are not the solution.

I tabled new clause 25 because the Bill’s scope, focusing on user-to-user and search services, is too narrow and not sufficiently future-proof. It should cover all digital technology that is likely to be accessed by children. The term

“likely to be accessed by children”

appears in the age-appropriate design code to ensure that the privacy of children’s data is protected. However, that more expansive definition is not included in the Bill, which imposes duties on only a subset of services to keep children safe. Given rapidly expanding technologies such as the metaverse—which is still in its infancy—and augmented reality, as well as addictive apps and games that promote loot boxes and gambling-type behaviour, we need a much more expansive definition.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I will try to avoid too much preamble, but I thank the former Minister, the hon. Member for Croydon South (Chris Philp), for all his work in Committee and for listening to my nearly 200 contributions, for which I apologise. I welcome the new Minister to his place.

As time has been short today, I am keen to meet the Minister to discuss my new clauses and amendments. If he cannot meet me, I would be keen for him to meet the NSPCC, in particular, on some of my concerns.

Amendment 196 is about using proactive technology to identify CSEA content, which we discussed at some length in Committee. The hon. Member for Croydon South made it very clear that we should use scanning to check for child sexual abuse images. My concern is that new clause 38, tabled by the Lib Dems, might exclude proactive scanning to look for child sexual abuse images. I hope that the Government do not lurch in that direction, because we need proactive scanning to keep children protected.

New clause 18 specifically addresses child user empowerment duties. The Bill currently requires that internet service providers have user empowerment duties for adults but not for children, which seems bizarre. Children need to be able to say yes or no. They should be able to make their own choices about excluding content and not receiving unsolicited comments or approaches from anybody not on their friend list, for example. Children should be allowed to do that, but the Bill explicitly says that user empowerment duties apply only to adults. New clause 18 is almost a direct copy of the adult user empowerment duties, with a few extra bits added. It is important that children have access to user empowerment.

Amendment 190 addresses habit-forming features. I have had conversations about this with a number of organisations, including The Mix. I regularly accessed its predecessor, The Site, more than 20 years ago, and it is concerned that 42% of young people surveyed by YoungMinds show addiction-like behaviour in what they are accessing on social media. There is nothing on that in this Bill. The Mix, the Mental Health Foundation, the British Psychological Society, YoungMinds and the Royal College of Psychiatrists are all unhappy about the Bill’s failure to regulate habit-forming features. It is right that we provide support for our children, and it is right that our children are able to access the internet safely, so it is important to address habit-forming behaviour.

Amendment 162 addresses child access assessments. The Bill currently says that providers need to do a child access assessment only if there is a “significant” number of child users. I do not think that is enough and I do not think it is appropriate, and the NSPCC agrees. The amendment would remove the word “significant.” OnlyFans, for example, should not be able to dodge the requirement to child risk assess its services because it does not have a “significant” number of child users. These sites are massively harmful, and we need to ensure changes are made so they cannot wriggle out of their responsibilities.

Finally, amendment 161 is about live, one-to-one oral communications. I understand why the Government want to exempt live, one-to-one oral communications, as they want to ensure that phone calls continue to be phone calls, which is totally fine, but they misunderstand the nature of things like Discord and how people communicate on Fortnite, for example. People are having live, one-to-one oral communications, some of which are used to groom children. We cannot explicitly exempt them and allow a loophole for perpetrators of abuse in this Bill. I understand what the Government are trying to do, but they need to do it in a different way so that children can be protected from the grooming behaviour we see on some online platforms.

Once again, if the Minister cannot accept these amendments, I would be keen to meet him. If he cannot do that, I ask that the NSPCC have a meeting with him.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

We have had a wide-ranging debate, full of passion and expert opinion from Members in all parts of the House, which shows the depth of interest in this subject and the depth of concern to see the Bill delivered and to make sure we get it right. I speak as someone who only a couple of days ago became the Minister for online safety, although I was previously involved in engaging with the Government on this subject. As I said in my opening remarks, this has been an iterative process, in which Members from across the House have worked successfully with the Government to improve the Bill. That is the spirit in which we should complete its stages, both in the Commons and in the Lords, and look at how we operate this regime once it has been created.

I wish to start by addressing remarks made by the hon. Member for Pontypridd (Alex Davies-Jones), the shadow Minister, and by the hon. Member for Cardiff North (Anna McMorrin) about violence against women and girls. There is a slight assumption that if the Government do not accept an amendment that writes “violence against women and girls” into the priority harms in the Bill, somehow the Bill does not address that issue. I think we would all agree that that is not the case. The provisions on harmful content that is directed at any individual, particularly the new harms offences approved by the Law Commission, do create offences in respect of content that is likely to lead to actual physical harm or severe psychological harm. As the father of a teenage girl, who was watching earlier but has now gone to do better things, I say that the targeting of young girls, particularly vulnerable ones, with content that is likely to make them more vulnerable is one of the most egregious aspects of the way social media works. It is right that we are looking to address serious levels of self-harm and suicide in the Bill and in the transparency requirements. We are addressing the self-harm and suicide content that falls below the illegal threshold, where a vulnerable young girl is sent and prompted with content that can make her more vulnerable and could lead her to harm herself, or worse. It is absolutely right that that is within the scope of the Bill.

New clause 3, perfectly properly, cites international conventions on violence against women and girls, and how that is defined. At the moment, with the way the Bill is structured, the schedule 7 offences are all based on existing areas of UK law, where there is an existing, clear criminal threshold. Those offences, which are listed extensively, will all apply as priority areas of harm. If there is, through the work of the Law Commission or elsewhere, a clear legal definition of misogyny and violence against women and girls that is not included, I think it should be included within scope. However, if new clause 3 was approved, as tabled, it would be a very different sort of offence, where it would not be as clear where the criminal threshold applied, because it is not cited against existing legislation. My view, and that of the Government, is that existing legislation covers the sorts of offences and breadth of offences that the shadow Minister rightly mentioned, as did other Members. We should continue to look at this—

Anna McMorrin Portrait Anna McMorrin
- Hansard - - - Excerpts

The Minister is not giving accurate information there. Violence against women and girls is defined by article 3 of the Council of Europe convention on preventing violence against women and domestic violence—the Istanbul convention. So there is that definition and it would be valid to put that in the Bill to ensure that all of that is covered.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I was referring to the amendment’s requirement to list that as part of the priority illegal harms. The priority illegal harms set out in the Bill are all based on existing UK Acts of Parliament where there is a clear established criminal threshold—that is the difference. The spirit of what that convention seeks to achieve, which we would support, is reflected in the harm-based offences written into the Bill. The big change in the structure of the Bill since the draft Bill was published—the Joint Committee on the Draft Online Safety Bill and I pushed for this at the time—is that far more of these offences have been clearly written into the Bill so that it is absolutely clear what they apply to. The new offences proposed by the Law Commission, particularly those relating to self-harm and suicide, are another really important addition. We know what the harms are. We know what we want this Bill to do. The breadth of offences that the hon. Lady and her colleagues have set out is covered in the Bill. But of course as law changes and new offences are put in place, the structure of the Bill, through the inclusion of new schedule 7 on priority offences, gives us the mechanism in the future, through instruments of this House, to add new offences to those primary illegal harms as they occur. I expect that that is what would happen. I believe that the spirit of new clause 3 is reflected in the offences that are written into the Bill.

The hon. Member for Pontypridd mentioned Government new clause 14. It is not true that the Government came up with it out of nowhere; there has been extensive consultation with Ofcom and others. The concern is that some social media companies, and some users of services, may have sought to interpret the criminal threshold as being based on whether a court of law has found that an offence has been committed, and only then might they act. Actually, we want them to pre-empt that, based on a clear understanding of where the legal threshold lies. That is how the regulatory codes work. So it is an attempt not to weaken the provision but to bring clarity to the companies and the regulator over its application.

The hon. Member for Ochil and South Perthshire (John Nicolson) raised an important point with regard to the Modern Slavery Act. As the Bill has gone along, we have included existing migration offences and trafficking offences. I would be happy to meet him further to discuss that aspect. Serious offences that exist in law should have an application, either as priority harms or as non-priority legal harms, and we should consider how we do that. I do not know whether he intends to press the amendment, but either way, I would be happy to meet him and to discuss this further.

My hon. Friend the Member for Solihull, the Chair of the Digital, Culture, Media and Sport Committee, raised an important matter with regard to the power of the Secretary of State, which was a common theme raised by several other Members. The hon. Member for Ochil and South Perthshire rightly quoted me, or my Committee’s report, back to me—always a chilling prospect for a politician. I think we have seen significant improvement in the Bill since the draft Bill was published. There was a time when changes to the codes could be made by the negative procedure; now they have to be by a positive vote of both Houses. The Government have recognised that they need to define the exceptional circumstances in which that provision might be used, and to define specifically the areas that are set out. I accept from the Chair of the Select Committee and my right hon. and learned Friend the Member for Kenilworth and Southam that those things could be interpreted quite broadly—maybe more broadly than people would like—but I believe that progress has been made in setting out those powers.

I would also say that this applies only to the period when the codes of practice are being agreed, before they are laid before Parliament. This is not a general provision. I think sometimes there has been a sense that the Secretary of State can at any time pick up the phone to Ofcom and have it amend the codes. Once the codes are approved by the House they are fixed. The codes do not relate to the duties. The duties are set out in the legislation. This is just the guidance that is given to companies on how they comply. There may well be circumstances in which the Secretary of State might look at those draft codes and say, “Actually, we think Ofcom has given the tech companies too easy a ride here. We expected the legislation to push them further.” Therefore it is understandable that in the draft form the Secretary of State might wish to have the power to raise that question, and not dictate to Ofcom but ask it to come back with amendments.

I take on board the spirit of what Members have said and the interest that the Select Committee has shown. I am happy to continue that dialogue, and obviously the Government will take forward the issues that they set out in the letter that was sent round last week to Members, showing how we seek to bring in that definition.

A number of Members raised the issue of freedom of speech provisions, particularly my hon. Friend the Member for Windsor (Adam Afriyie) at the end of his excellent speech. We have sought to bring, in the Government amendments, additional clarity to the way the legislation works, so that it is absolutely clear what the priority legal offences are. Where we have transparency requirements, it is absolutely clear what they apply to. The amendment that the Government tabled reflects the work that he and his colleagues have done, setting out that if we are discussing the terms of service of tech companies, it should be perfectly possible for them to say that this is not an area where they intend to take enforcement action and the Bill does not require them to do so.

The hon. Member for Batley and Spen (Kim Leadbeater) mentioned Zach’s law. The hon. Member for Ochil and South Perthshire raised that before the Joint Committee. So, too, did my hon. Friend the Member for Watford (Dean Russell); he and the hon. Member for Ochil and South Perthshire are great advocates on that. It is a good example of how a clear offence, something that we all agree to be wrong, can be tackled through this legislation; in this case, a new offence will be created, to prevent the pernicious targeting of people with epilepsy with flashing images.

Finally, in response to the speech by the hon. Member for Aberdeen North (Kirsty Blackman), I certainly will continue dialogue with the NSPCC on the serious issues that she has raised. Obviously, child protection is foremost in our mind as we consider the legislation. She made some important points about the ability to scan for encrypted images. The Government have recently made further announcements on that, to be reflected as the Bill progresses through the House.

Nigel Evans Portrait Mr Deputy Speaker (Mr Nigel Evans)
- Hansard - - - Excerpts

To assist the House, I anticipate two votes on this first section and one vote immediately on the next, because it has already been moved and debated.

--- Later in debate ---
16:30

Division 35

Ayes: 226

Noes: 292

Clause 5
--- Later in debate ---
16:45

Division 36

Ayes: 229

Noes: 294

Nigel Evans Portrait Mr Deputy Speaker (Mr Nigel Evans)
- Hansard - - - Excerpts

I am anticipating another Division, as I said, and then I understand there may be some points of order, which I will hear after that Division.

That concludes proceedings on new clauses, new schedules and amendments to those parts of the Bill that have to be concluded by 4.30 pm.

It has been pointed out to me that, in this unusually hot weather, Members should please remember to drink more water. I tried it myself once. [Laughter.]

In accordance with the programme (No. 2) order of today, we now come to new clauses, new schedules and amendments relating to those parts of the Bill to be concluded by 7 pm. We begin with new clause 14, which the House has already debated. I therefore call the Minister to move new clause 14 formally.

New Clause 14

Providers’ judgements about the status of content

“(1) This section sets out the approach to be taken where—

(a) a system or process operated or used by a provider of a Part 3 service for the purpose of compliance with relevant requirements, or

(b) a risk assessment required to be carried out by Part 3, involves a judgement by a provider about whether content is content of a particular kind.

(2) Such judgements are to be made on the basis of all relevant information that is reasonably available to a provider.

(3) In construing the reference to information that is reasonably available to a provider, the following factors, in particular, are relevant—

(a) the size and capacity of the provider, and

(b) whether a judgement is made by human moderators, by means of automated systems or processes or by means of automated systems or processes together with human moderators.

(4) Subsections (5) to (7) apply (as well as subsection (2)) in relation to judgements by providers about whether content is—

(a) illegal content, or illegal content of a particular kind, or

(b) a fraudulent advertisement.

(5) In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is content of the kind in question (and a provider must treat content as content of the kind in question if reasonable grounds for that inference exist).

(6) Reasonable grounds for that inference exist in relation to content and an offence if, following the approach in subsection (2), a provider—

(a) has reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied, and

(b) does not have reasonable grounds to infer that a defence to the offence may be successfully relied upon.

(7) In the case of content generated by a bot or other automated tool, the tests mentioned in subsection (6)(a) and (b) are to be applied in relation to the conduct or mental state of a person who may be assumed to control the bot or tool (or, depending what a provider knows in a particular case, the actual person who controls the bot or tool).

(8) In considering a provider’s compliance with relevant requirements to which this section is relevant, OFCOM may take into account whether providers’ judgements follow the approaches set out in this section (including judgements made by means of automated systems or processes, alone or together with human moderators).

(9) In this section—

“fraudulent advertisement” has the meaning given by section 34 or 35 (depending on the kind of service in question);

“illegal content” has the same meaning as in Part 3 (see section 52);

“relevant requirements” means—

(a) duties and requirements under this Act, and

(b) requirements of a notice given by OFCOM under this Act.”—(Damian Collins.)

This new clause clarifies how providers are to approach judgements (human or automated) about whether content is content of a particular kind, and in particular, makes provision about how questions of mental state and defences are to be approached when considering whether content is illegal content or a fraudulent advertisement.

Brought up.

Question put, That the clause be added to the Bill.

--- Later in debate ---
16:59

Division 37

Ayes: 288

Noes: 229

New clause 14 read a Second time, and added to the Bill.
--- Later in debate ---
Ronnie Cowan Portrait Ronnie Cowan
- View Speech - Hansard - - - Excerpts

I absolutely agree. We can also look at this from the point of view of gambling reform and age verification for that. The technology is there, and we can harness and use it to protect people. All I am asking is that we do not let this slip through the cracks this evening.

Damian Collins Portrait Damian Collins
- View Speech - Hansard - - - Excerpts

We have had an important debate raising a series of extremely important topics. While the Government may not agree with the amendments that have been tabled, that is not because of a lack of seriousness of concern about the issues that have been raised.

The right hon. Member for Kingston upon Hull North (Dame Diana Johnson) spoke very powerfully. I have also met Leigh Nicol, the lady she cited, and she discussed with me the experience that she had. Sadly, it was during lockdown and it was a virtual meeting rather than face to face. There are many young women, in particular, who have experienced the horror of having intimate images shared online without their knowledge or consent and then gone through the difficult experience of trying to get them removed, even when it is absolutely clear that they should be removed and are there without their consent. That is the responsibility of the companies and the platforms to act on.

Thinking about where we are now, before the Bill passes, the requirement to deal with illegal content, even the worst illegal content, on the platforms is still largely based on the reporting of that content, without the ability for us to know how effective the companies are at actually removing it. That is largely based on old legislation. The Bill will move things on significantly by creating proactive responsibilities not just to discover illegal content but to act to mitigate it, and to be audited to see how effectively that is done. Under the Bill, that includes content that would be considered to be an abuse of children: a child cannot give consent to have sex or to appear in pornographic content. Companies need to make sure that what they are doing is sufficient to meet that requirement.

It should be for the regulator, Ofcom, as part of putting together the codes of practice, to understand, even on more extreme content, what systems companies have in place to ensure that they are complying with the law and certainly not knowingly hosting content that has been flagged to them as non-consensual pornography or child abuse images, which is effectively what pornography featuring minors would be; and, as hon. Members have said, to make sure that companies are using available technologies in order to deliver that.

Jess Phillips Portrait Jess Phillips
- View Speech - Hansard - - - Excerpts

We have an opportunity here today to make sure that the companies are doing it. I am not entirely sure why we would not take that opportunity to legislate to make sure that they are. With the greatest of respect to the Minister, who is back in a position of authority, it sounds an awful lot like the triumph of hope over experience.

Damian Collins Portrait Damian Collins
- View Speech - Hansard - - - Excerpts

It is because of the danger of such a sentiment that this Bill is so important. It not only sets the targets and requirements for companies to act against illegal content, but enables a regulator to ensure that they have the systems and processes in place to do so, that they are using appropriate technology and that they apply the principle that their systems should be effective at addressing this issue. If those systems are deficient, that is a failure on the company's part. It cannot be good enough for a company to say, "It is too difficult to do", when it is not using technologies that would readily solve the problem. We believe that the technologies the companies have, together with the regulator's powers to put proper codes of practice in place and to order companies to comply with them, will be sufficient to address the concern that the hon. Lady raises.

Diana Johnson Portrait Dame Diana Johnson
- View Speech - Hansard - - - Excerpts

I am a little taken aback that the Minister believes that the legislation will be sufficient. I do not understand why he has not responded to the point that my hon. Friend the Member for Birmingham, Yardley (Jess Phillips) was making that we could make this happen by putting the proposal in the Bill and saying, “This is a requirement.” I am not sure why he thinks that is not the best way forward.

Damian Collins Portrait Damian Collins
- View Speech - Hansard - - - Excerpts

It is because the proposal would not make such content more illegal than it is now. It is already illegal and there are already legal duties on companies to act. The regulator’s job is to ensure they have the systems in place to do that effectively, and that is what the Bill sets out. We believe that the Bill addresses the serious issue that the right hon. Lady raises in her amendments. That legal requirement is there, as is the ability to have the systems in place.

If I may, I will give a different example based on the fraud example given by the shadow Minister, the hon. Member for Worsley and Eccles South (Barbara Keeley). On the Joint Committee that scrutinised the Bill, we pushed hard to have fraudulent ads included within the scope of the Bill, which has been one of the important amendments to it. The regulator can consider what systems companies should have in place to identify fraud, but also what technologies they employ to make it far less likely that fraud would be there in the first place. Google has a deal with the Financial Conduct Authority, whereby it prevents advertisers from non-accredited companies from advertising on its platform. That makes it far less likely that fraudulent ads will appear because, if the system works, only properly recognised organisations will be advertising.

Facebook does not have such a system in place. As a consequence, since the Google system went live, we have seen a dramatic drop in fraud ads on Google, but a substantial increase in fraud ads on Facebook and platforms such as Instagram. That shows that if we have the right systems in place, we can have a better outcome and change the result. The job of the regulator with illegal pornography and other illegal content should be to look at those systems and say, “Do the companies have the right technology to deliver the result that is required?” If they do not, that would still be a failure of the codes.

Baroness Keeley Portrait Barbara Keeley
- View Speech - Hansard - - - Excerpts

The Minister is quoting a case that I quoted in Committee, and the former Minister, the hon. Member for Croydon South (Chris Philp), would not accept amendments on this issue. We could have tightened up on fraudulent advertising. If Google can do that for financial ads, other platforms can do it. We tabled an amendment that the Government did not accept. I do not know why this Minister is quoting something that we quoted in Committee—I know he was not there, but he needs to know that we tried this and the former Minister did not accept what we called for.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I am quoting that case merely because it is a good example of how, if we have better systems, we can get a better result. As part of the codes of practice, Ofcom will be able to look at some of these other systems and say to companies, “This is not just about content moderation; it is about having better systems that detect known illegal activity earlier and prevent it from getting on to the platform.” It is not about how quickly it is removed, but how effective companies are at stopping it ever being there in the first place. That is within the scope of regulation, and my belief is that those powers exist at the moment and therefore should be used.

Jess Phillips Portrait Jess Phillips
- Hansard - - - Excerpts

Just to push on this point, images of me have appeared on pornographic sites. They were not necessarily illegal images of anything bad happening to me, but other Members of Parliament in this House and I have suffered from that. Is the Minister telling me that this Bill will allow me to get in touch with that site and have an assurance that that image will be taken down and that it would be breaking the law if it did not do so?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The Bill absolutely addresses the sharing of non-consensual images in that way, so that would be something the regulator should take enforcement action against—

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Well, the regulator is required, and has the power, to take enforcement action against companies for failing to do so. That is what the legislation sets out, and we will be in a very different place from where we are now. That is why the Bill constitutes a very significant reform.

Lloyd Russell-Moyle Portrait Lloyd Russell-Moyle
- Hansard - - - Excerpts

Will the Minister give way?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Very briefly, and then I want to wrap up.

Lloyd Russell-Moyle Portrait Lloyd Russell-Moyle
- Hansard - - - Excerpts

Could the Minister give me a reassurance about when consent is withdrawn? The image may initially have been there “consensually”—I would put that in inverted commas—so the platform is okay to put it there. However, if someone contacts the platform saying that they now want to change their consent—they may want to take a role in public life, having previously had a different role; I am not saying that about my hon. Friend the Member for Birmingham, Yardley (Jess Phillips)—my understanding is that there is no ability legally to enforce that content coming down. Can the Minister correct me, and if not, why is he not supporting new clause 7?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The case of people who have appeared in pornographic films consensually and signed contracts to do so is a very different matter from that of intimate images being shared without consent. I am saying that the Bill sets out very clearly—it did not do so in draft form—that non-consensual sexual images and extreme pornography are within the scope of the regulator's power. The regulator should be taking action not just on what a company does to take such content down when it is discovered after the event, but on what systems the company has in place and whether it deploys all available technology to make sure that such content is never there in the first place.

Before closing, I want to touch briefly on the point raised about the Secretary of State’s powers to designate priority areas of harm. This is now under the affirmative procedure in the Bill, and it requires the approval of both Houses of Parliament. The priority illegal harms will be based on offences that already exist in law, and we are writing those priority offences into the Bill. The other priorities will be areas where the regulator will seek to test whether companies adhere to their terms of service. The new transparency requirements will set that out, and the Government have said that we will set out in more detail which of those priority areas of harm such transparency will apply to. There is still more work to be done on that, but we have given an indicative example. However, when it comes to adding a new priority illegal offence to the Bill, the premise is that it will already be an offence that Parliament has created, and writing it into the Bill will be done with the positive consent of Parliament. I think that is a substantial improvement on where the Bill was before. I am conscious that I have filled my time.

Question put, That the clause be read a Second time.

--- Later in debate ---
18:17

Division 38

Ayes: 220

Noes: 285

Clause 34
--- Later in debate ---
18:34

Division 39

Ayes: 188

Noes: 283

Clause 193