2nd reading
Tuesday 19th April 2022

Commons Chamber
Online Safety Act 2023
[Relevant Documents: Report of the Joint Committee on the Draft Online Safety Bill, Session 2021-22: Draft Online Safety Bill, HC 609, and the Government Response, CP 640; Eighth Report of the Digital, Culture, Media and Sport Committee, The Draft Online Safety Bill and the legal but harmful debate, HC 1039, and the Government response, HC 1221; Second Report of the Digital, Culture, Media and Sport Committee, Session 2019-21, Misinformation in the COVID-19 Infodemic, HC 234, and the Government response, HC 894; Second Report of the Petitions Committee, Tackling Online Abuse, HC 766, and the Government response, HC 1224; Eleventh Report of the Treasury Committee, Economic Crime, HC 145; e-petition 272087, Hold online trolls accountable for their online abuse via their IP address; e-petition 332315, Ban anonymous accounts on social media; e-petition 575833, Make verified ID a requirement for opening a social media account; e-petition 582423, Repeal Section 127 of the Communications Act 2003 and expunge all convictions; e-petition 601932, Do not restrict our right to freedom of expression online.]
Second Reading
19:36
The Secretary of State for Digital, Culture, Media and Sport (Ms Nadine Dorries)

I beg to move, That the Bill be now read a Second time.

Given the time and the number of people indicating that they wish to speak, and given that we will have my speech, the shadow Minister’s speech and the two winding-up speeches, there might be 10 minutes left for people to speak. I will therefore take only a couple of interventions and speak very fast in the way I can, being northern.

Almost every aspect of our lives is now conducted via the internet, from work and shopping to keeping up with our friends, family and worldwide real-time news. Via our smartphones and tablets, we spend ever more of our lives online rather than in the real world.

In the past 20 years or so, it is fair to say that the internet has overwhelmingly been a force for good, for prosperity and for progress, but Members on both sides of the House will agree that, as technology has advanced at warp speed, so have the new dangers this progress presents to children and young people.

Mr Mark Francois (Rayleigh and Wickford) (Con)

My right hon. Friend will know that, last Wednesday, the man who murdered our great friend Sir David Amess was sentenced to a whole-life term. David felt very strongly that we need legislation to protect MPs, particularly female MPs, from vile misogynistic abuse. In his memory, will she assure me that her Bill will honour the spirit of that request?

Ms Dorries

Sir David was a friend to all of us, and he was very much at the forefront of my mind during the redrafting of this Bill over the last few months. I give my right hon. Friend my absolute assurance on that.

Jim Shannon (Strangford) (DUP)

A number of constituents have contacted me over the last few months about eating disorders, particularly anorexia and bulimia, and about bullying in schools. Will the Secretary of State assure me and this House that those concerns will be addressed by this Bill so that my constituents are protected?

Ms Dorries

They will. Inciting people to take their own life or encouraging eating disorders in anorexia chatrooms—all these issues are covered by the Bill.

Several hon. Members rose—

Ms Dorries

I will take one more intervention.

Jonathan Gullis (Stoke-on-Trent North) (Con)

I am grateful to my right hon. Friend, and I thank her for her written communications regarding Angela Stevens, the mother of Brett, who tragically took his own life having been coerced by some of these vile online sites. The Law Commission considered harmful online communications as part of the Bill’s preparation, and one of its recommendations is to introduce a new offence of encouraging or assisting self-harm. I strongly urge my right hon. Friend to adopt that recommendation. Can she say more on that?

Ms Dorries

Yes. Exactly those issues will be listed in secondary legislation, under “legal but harmful”. I will talk about that further in my speech, but “legal but harmful” focuses on some of the worst harmful behaviours. We are talking not about an arbitrary list, but about incitement to encourage people to take their own life and encouraging people into suicide chatrooms—behaviour that is not illegal but which is indeed harmful.

Several hon. Members rose—

Ms Dorries

I am going to whizz through my speech now in order to allow people who have stayed and want to speak to do so.

As the Minister for mental health for two years, I too often heard stories such as the one just highlighted by my hon. Friend the Member for Stoke-on-Trent North (Jonathan Gullis). We have all sat down with constituents and listened as the worst stories any parent could recount were retold: stories of how a 14-year-old girl took her own life after being directed via harmful algorithms into a suicide chatroom, and of how a child was bombarded with pro-anorexia content, or with posts encouraging self-harm or cyber-bullying.

School bullying used to stop at the school gate. Today, it accompanies a child home, on their mobile phone, and is lurking in the bedroom waiting when they switch on their computer. It is the last thing a bullied child reads at night before they sleep and the first thing they see when they wake in the morning. A bullied child is no longer bullied in the playground on school days; they are bullied 24 hours a day, seven days a week. Childhood innocence is being stolen at the click of a button. One extremely worrying figure from 2020 showed that 80% of 12 to 15-year-olds had at least one potentially harmful online experience in the previous year.

We also see this every time a footballer steps on to the pitch, only to be subjected to horrific racism online, including banana and monkey emojis. As any female MP in this House will tell you, a woman on social media—I say this from experience—faces a daily barrage of toxic abuse. It is not criticism—criticism is fair game—but horrific harassment and serious threats of violence. Trolls post that they hope we get raped or killed, urge us to put a rope around our neck, or say they want to watch us burn alive in a car—my own particular experience.

All this behaviour is either illegal or, almost without exception, explicitly banned in a platform’s terms and conditions. Commercially, it has to be. If a platform stated openly that it allowed such content on its sites, which advertisers, its financial lifeblood, would knowingly endorse and advertise on it? Which advertisers would do that? Who would openly use or allow their children to use sites that state that they allow illegal and harmful activity? None, I would suggest, and platforms know that. Yet we have almost come to accept this kind of toxic behaviour and abuse as part and parcel of online life. We have factored online abuse and harm into our daily way of life, but it should not and does not have to be this way.

This Government promised in their manifesto to pass legislation to tackle these problems and to make the UK the

“safest place in the world to be online”

especially for children. We promised legislation that would hold social media platforms to the promises they have made to their own users—their own stated terms and conditions—promises that too often are broken with no repercussions. We promised legislation that would bring some fundamental accountability to the online world. That legislation is here in the form of the ground- breaking Online Safety Bill. We are leading the way and free democracies across the globe are watching carefully to see how we progress this legislation.

The Bill has our children’s future, their unhindered development and their wellbeing at its heart, while at the same time providing enhanced protections for freedom of speech. At this point, I wish to pay tribute to my predecessors, who have each trodden the difficult path of balancing freedom of speech and addressing widespread harms, including my immediate predecessor and, in particular, my hon. Friend the Member for Gosport (Dame Caroline Dinenage), who worked so hard, prior to my arrival in the Department for Digital, Culture, Media and Sport, with stakeholders and platforms, digging in to identify the scope of the problem.

Let me summarise the scope of the Bill. We have reserved our strongest measures in this legislation for children. For the first time, platforms will be required under law to protect children and young people from all sorts of harm, from the most abhorrent child abuse to cyber-bullying and pornography. Tech companies will be expected to use every possible tool to do so, including introducing age-assurance technologies, and they will face severe consequences if they fail in the most fundamental of requirements to protect children. The bottom line is that, by our passing this legislation, our youngest members of society will be far safer when logging on. I am so glad to see James Okulaja and Alex Holmes from The Diana Award here today, watching from the Gallery as we debate this groundbreaking legislation. We have worked closely with them as we have developed the legislation, as they have dedicated a huge amount of their time to protecting children from online harms. This Bill is for them and those children.

The second part of the Bill makes sure that platforms design their services to prevent them from being abused by criminals. When illegal content does slip through the net, such as child sex abuse and terrorist content, they will need to have effective systems and processes in place to quickly identify it and remove it from their sites. We will not allow the web to be a hiding place or a safe space for criminals. The third part seeks to force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all sorts of other unacceptable behaviour that they claim not to allow but that ruins lives in practice. In other words, we are simply asking the largest platforms to do what they say they will do, as we do in all good consumer protection measures in any other industry. If platforms fail in any of those basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down the full weight of the law upon them.

Several hon. Members rose—

Ms Dorries

I will take just two more interventions and that will be it, otherwise people will not have a chance to speak.

Sir John Hayes (South Holland and The Deepings) (Con)

I am very grateful to my right hon. Friend for giving way. The internet giants that run the kind of awful practices that she has described have for too long been unaccountable, uncaring and unconscionable in the way they have fuelled every kind of spite and fed every kind of bigotry. Will she go further in this Bill and ensure that, rather like any other publisher, if those companies are prepared to allow anonymous posts, they are held accountable for those posts and subject to the legal constraints that a broadcaster or newspaper would face?

Ms Dorries

These online giants will be held accountable to their own terms and conditions. They will no longer be able to allow illegal content to be published, and we will also be listing in secondary legislation content that is legal but harmful. We will be holding those tech giants to account.

Munira Wilson (Twickenham) (LD)

I thank the Secretary of State for giving way. She talked about how this Bill is going to protect children much more, and it is a welcome step forward. However, does she accept that there are major gaps in this Bill? For instance, gaming is not covered. It is not clear whether things such as virtual reality and the metaverse are going to be covered. [Interruption.] It is not clear, and all the experts will tell us that. The codes of practice in the Bill are only recommended guidance; they are not mandatory and binding on companies. That will encourage a race to the bottom.

Ms Dorries

The duties are mandatory; it is the Online Safety Bill and the metaverse is included in the Bill. Not only is it included, but, moving forward, the provisions in the Bill will allow us to move swiftly with the metaverse and other things. We did not even know that TikTok existed when this Bill started its journey. These provisions will allow us to move quickly to respond.

Several hon. Members rose—

Ms Dorries

I will take one more intervention, but that is it.

Damian Green (Ashford) (Con)

I am grateful to my right hon. Friend for giving way. One of the most important national assets that needs protecting in this Bill and elsewhere is our reputation for serious journalism. Will she therefore confirm that, as she has said outside this House, she intends to table amendments during the passage of the Bill that will ensure that platforms and search engines that have strategic market status protect access to journalism and content from recognised news publishers, ensuring that it is not moderated, restricted or removed without notice or right of appeal, and that those news websites will be outside the scope of the Bill?

Ms Dorries

We have already done that—it is already in the Bill.

Daniel Kawczynski (Shrewsbury and Atcham) (Con)

Will my right hon. Friend give way?

Ms Dorries

No, I have to continue.

Not only will the Bill protect journalistic content, democratic content and democratic free speech, but if one of the tech companies wanted to take down journalistic content, the Bill includes a right of appeal for journalists, which currently does not exist. We are doing further work on that to ensure that content remains online while the appeal takes place. The appeal process has to be robust and consistent across the board for all the appeals that take place. We have already done more work on that issue in this version of the Bill and we are looking to do more as we move forward.

As I have said, we will not allow the web to be a hiding place or safe space for criminals and when illegal content does slip through the net—such as child sex abuse and terrorist content— online platforms will need to have in place effective systems and processes to quickly identify that illegal content and remove it from their sites.

The third measure will force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all the other unacceptable behaviours. In other words, we are asking the largest platforms to do what they say they will do, just as happens with all good consumer-protection measures in any other industry. Should platforms fail in any of their basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down upon them the full weight of the law. Such action includes searching platforms’ premises and confiscating their equipment; imposing huge fines of up to 10% of their global turnover; pursuing criminal sanctions against senior managers who fail to co-operate; and, if necessary, blocking their sites in the UK.

We know that tech companies can act very quickly when they want to. Last year, when an investigation revealed that Pornhub allowed child sexual exploitation and abuse imagery to be uploaded to its platform, Mastercard and Visa blocked the use of their cards on the site. Lo and behold, threatened with the prospect of losing a huge chunk of its profit, Pornhub suddenly removed nearly 10 million child sexual exploitation videos from its site overnight. These companies have the tools but, unfortunately, as they have shown time and again, they need to be forced to use them. That is exactly what the Bill will do.

Before I move on, let me point out something very important: this is not the same Bill as the one published in draft form last year. I know that Members throughout the House are as passionate as I am about getting this legislation right, and I had lots of constructive feedback on the draft version of the Bill. I have listened carefully to all that Members have had to say throughout the Bill’s process, including by taking into account the detailed feedback from the Joint Committee, the Digital, Culture, Media and Sport Committee and the Petitions Committee. They have spent many hours considering every part of the Bill, and I am extremely grateful for their dedication and thorough recommendations on how the legislation could be improved.

As a result of that feedback process, over the past three months or so I have strengthened the legislation in a number of important ways. There were calls for cyber-flashing to be included; cyber-flashing is now in the Bill. There were calls to ensure that the legislation covered all commercial pornography sites; in fact, we have expanded the Bill’s scope to include every kind of provider of pornography. There were concerns about anonymity, so we have strengthened the Bill so that it now requires the biggest tech platforms to offer verification and empowerment tools for adult users, allowing people to block anonymous trolls from the beginning.

I know that countless MPs are deeply concerned about how online fraud—particularly scam ads—has proliferated over the past few years. Under the new version of the Bill, the largest and highest-risk companies—those that stand to make the most profit—must tackle scam ads that appear on their services.

We have expanded the list of priority offences named on the face of the legislation to include not just terrorism and child abuse imagery but revenge porn, fraud, hate crime, encouraging and assisting suicide, and organised immigration crime, among other offences.

If anyone doubted our appetite to go after Silicon Valley executives who do not co-operate with Ofcom, they will see that we have strengthened the Bill so that the criminal sanctions for senior managers will now come into effect as soon as possible after Royal Assent—I am talking weeks, not years. We have expanded the things for which those senior managers will be criminally liable to cover falsifying data, destroying data and obstructing Ofcom’s access to their premises.

In addition to the regulatory framework in the Bill that I have described, we are creating three new criminal offences. While the regulatory framework is focused on holding companies to account, the criminal offences will be focused on individuals and the way people use and abuse online communications. Recommended by the Law Commission, the offences will address coercive and controlling behaviour by domestic abusers; threats to rape, kill or inflict other physical violence; and the sharing of dangerous disinformation deliberately to inflict harm.

This is a new, stronger Online Safety Bill. It is the most important piece of legislation that I have ever worked on and it has been a huge team effort to get here. I am confident that we have produced something that will protect children and the most vulnerable members of society while being flexible and adaptable enough to meet the challenges of the future.

Let me make something clear in relation to freedom of speech. Anyone who has actually read the Bill will recognise that its defining focus is the tackling of serious harm, not the curtailing of free speech or the prevention of adults being upset or offended by something they have seen online. In fact, along with countless others throughout the House, I am seriously concerned about the power that big tech has amassed over the past two decades and the huge influence that Silicon Valley now wields over public debate.

We in this place are not the arbiters of free speech. We have left it to unelected tech executives on the west coast to police themselves. They decide who is and who is not allowed on the internet. They decide whose voice should be heard and whose should be silenced—whose content is allowed up and what should be taken down. Too often, their decisions are arbitrary and inconsistent. We are left, then, with a situation in which the president of the United States can be banned by Twitter while the Taliban is not; in which talkRADIO can be banned by YouTube for 12 hours; in which an Oxford academic, Carl Heneghan, can be banned by Twitter; or in which an article in The Mail on Sunday can be plastered with a “fake news” label—all because they dared to challenge the west coast consensus or to express opinions that Silicon Valley does not like.

It is, then, vital that the Bill contains strong protections for free speech and for journalistic content. For the first time, under this legislation all users will have an official right to appeal if they feel their content has been unfairly removed. Platforms will have to explain themselves properly if they remove content and will have special new duties to protect journalistic content and democratically important content. They will have to keep those new duties in mind whenever they set their terms and conditions or moderate any content on their sites. I emphasise that the protections are new. The new criminal offences update section 1 of the Malicious Communications Act 1988 and section 127 of the Communications Act 2003, which were so broad that they interfered with free speech while failing to address seriously harmful consequences.

Without the Bill, social media companies would be free to continue to arbitrarily silence or cancel those with whom they do not agree, without any need for explanation or justification. That situation should be intolerable for anyone who values free speech. For those who quite obviously have not read the Bill and say that it concedes power to big tech companies, I have this to say: those big tech companies have all the power in the world that they could possibly want, right now. How much more power could we possibly concede?

That brings me to my final point. We now face two clear options. We could choose not to act and leave big tech to continue to regulate itself and mark its own homework, as it has been doing for years with predictable results. We have already seen that too often, without the right incentives, tech companies will not do what is needed to protect their users. Too often, their claims about taking steps to fix things are not backed up by genuine actions.

I can give countless examples from the past two months alone of tech not taking online harm and abuse seriously, wilfully promoting harmful algorithms or putting profit before people. A recent BBC investigation showed that women’s intimate pictures were being shared across the platform Telegram to harass, shame and blackmail women. The BBC reported 100 images to Telegram as pornography, but 96 were still accessible a month later. Tech did not act.

Twitter took six days to suspend the account of rapper Wiley after his disgusting two-day antisemitic rant. Just last week, the Centre for Countering Digital Hate said that it had reported 253 accounts to Instagram as part of an investigation into misogynistic abuse on the platform, but almost 90% remained active a month later. Again, tech did not act.

Remember: we have been debating these issues for years. They were the subject of one of my first meetings in this place in 2005. During that time, things have got worse, not better. If we choose the path of inaction, it will be on us to explain to our constituents why we did nothing to protect their children from preventable risks, such as grooming, pornography, suicide content or cyber-bullying. To those who say protecting children is the responsibility of parents, not the job of the state, I would quote the 19th-century philosopher John Stuart Mill, one of the staunchest defenders of individual freedom. He wrote in “On Liberty” that the role of the state was to fulfil the responsibility of the parent in order to protect a child where a parent could not. If we choose not to act, in the years to come we will no doubt ask ourselves why we did not act to impose fundamental online protections.

However, we have another option. We can pass this Bill and take huge steps towards tackling some of the most serious forms of online harm: child abuse, terrorism, harassment, death threats, and content that is harming children across the UK today. We could do what John Stuart Mill wrote was the core duty of Government. The right to self-determination is not unlimited. An action that results in doing harm to another is not only wrong, but wrong enough that the state can intervene to prevent that harm from occurring. We do that in every other part of our life. We erect streetlamps to make our cities and towns safer. We put speed limits on our roads and make seatbelts compulsory. We make small but necessary changes to protect people from grievous harm. Now it is time to bring in some fundamental protections online.

We have the legislation ready right now in the form of the Online Safety Bill. All we have to do is pass it. I am proud to commend the Bill to the House.

Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. Before I call the shadow Secretary of State, it will be obvious to the House that we have approximately one hour for Back-Bench contributions and that a great many people want to speak. I warn colleagues that not everybody will have the opportunity and that there will certainly be a time limit, which will probably begin at five minutes.

20:02
Lucy Powell (Manchester Central) (Lab/Co-op)

Thank you, Madam Deputy Speaker. It has been a busy day, and I will try to keep my remarks short. It is a real shame that the discussion of an important landmark Bill, with so many Members wanting to contribute, has been squeezed into such a tiny amount of time.

Labour supports the principles of the Online Safety Bill. There has been a wild west online for too long. Huge platforms such as Facebook and Google began as start-ups but now wield enormous influence over almost every aspect of our lives: how we socialise and shop, where we get our news and views, and even the outcomes of elections and propaganda wars. There have been undoubted benefits, but the lack of regulation has let harms and abuses proliferate. From record reports of child abuse to soaring fraud and scams, from racist tweets to Russia’s disinformation campaigns, there are too many harms that, as a society, we have been unable or unwilling to address.

There is currently no regulator. However, neither the Government nor Silicon Valley should have control over what we can say and do online. We need strong, independent regulation.

Dan Carden (Liverpool, Walton) (Lab)

Will my hon. Friend give way?

Lucy Powell

I will give way once on this point.

Dan Carden

I am grateful. The Secretary of State talked about getting the tech giants to follow their own rules, but we know from Frances Haugen, the Facebook whistleblower, that companies were driving children and adults to harmful content, because it increased engagement. Does that not show that we must go even further than asking them to follow their own rules?

Lucy Powell

I very much agree with my hon. Friend, and I will come on to talk about that shortly.

The Online Safety Bill is an important step towards strong, independent regulation. We welcome the Bill’s overall aim: the duty of care framework based on the work of the Carnegie Trust. I agree with the Secretary of State that the safety of children should be at the heart of this regulation. The Government have rightly now included fraud, online pornography and cyber-flashing in the new draft of the Bill, although they should have been in scope all along.

Wera Hobhouse (Bath) (LD)

Will the hon. Lady give way?

Lucy Powell

I am not going to give way, sorry.

Before I get on to the specifics, I will address the main area of contention: the balance between free speech and regulation, most notably expressed via the “legal but harmful” clauses.

Christian Wakeford (Bury South) (Lab)

Will my hon. Friend give way?

Lucy Powell

I will give way one last time.

Christian Wakeford

I thank my hon. Friend. The Government have set out the priority offences in schedule 7 to the Bill, but legal harms have clearly not been specified. Given the torrent of racist, antisemitic and misogynistic abuse that grows every single day, does my hon. Friend know why the Bill has not been made more cohesive with a list of core legal harms, allowing for emerging threats to be dealt with in secondary legislation?

Lucy Powell

I will come on to some of those issues. My hon. Friend makes a valid point.

I fear the Government’s current solution to the balance between free speech and regulation will please no one and takes us down an unhelpful rabbit hole. Some believe the Bill will stifle free speech, with platforms over-zealously taking down legitimate political and other views. In response, the Government have put in what they consider to be protections for freedom of speech and have committed to setting out an exhaustive list of “legal but harmful” content, thus relying almost entirely on a “take down content” approach, which many will still see as Government overreach.

On the other hand, those who want harmful outcomes addressed through stronger regulation are left arguing over a yet-to-be-published list of Government-determined harmful content. This content-driven approach moves us in the wrong direction away from the “duty of care” principles the Bill is supposed to enshrine. The real solution is a systems approach based on outcomes, which would not only solve the free speech question, but make the Bill overall much stronger.

What does that mean in practice? Essentially, rather than going after individual content, go after the business models, systems and policies that drive the impact of such harms—[Interruption.] The Minister for Security and Borders, the right hon. Member for East Hampshire (Damian Hinds), says from a sedentary position that that is what the Bill does, but none of the leading experts in the field think the same. He should talk to some of them before shouting at me.

The business models of most social media companies are currently based on engagement, as my hon. Friend the Member for Liverpool, Walton (Dan Carden) outlined. The more engagement, the more money they make, which rewards controversy, sensationalism and fake news. A post containing a racist slur or anti-vax comment that nobody notices, shares or reads is significantly less harmful than a post that is quickly able to go viral. A collective pile-on can have a profoundly harmful effect on the young person on the receiving end, even though most of the individual posts would not meet the threshold of harmful.

Matt Rodda (Reading East) (Lab)

Will my hon. Friend give way on that point?

Lucy Powell

I will not, sorry. Facebook whistleblower Frances Haugen, whom I had the privilege of meeting, cited many examples to the Joint Committee on the draft Online Safety Bill of Facebook’s models and algorithms making things much worse. Had the Government chosen to follow the Joint Committee’s recommendations for a systems-based approach rather than a content-driven one, the Bill would be stronger and concerns about free speech would be reduced.

Lucy Powell

I am sorry, but too many people want to speak. Members should talk to their business managers, who have cut—[Interruption.] I know the hon. Gentleman was Chair of the Committee—[Interruption.]

Madam Deputy Speaker (Dame Eleanor Laing)

Order. The hon. Lady is not giving way. Let us get on with the debate.

Lucy Powell

The business managers have failed everybody on both sides given the time available.

A systems-based approach also has the benefit of tackling the things that platforms can control, such as how content spreads, rather than what they cannot control, such as what people post. We would avoid the cul-de-sac of arguing over the definitions of what content is or is not harmful, and instead go straight to the impact. I urge the Government to adopt the recommendations that have been made consistently to focus the Bill on systems and models, not simply on content.

Turning to other aspects of the Bill, key issues with its effectiveness remain. The first relates to protecting children. As any parent will know, children face significant risks online, from poor body image, bullying and sexist trolling to the most extreme grooming and child abuse, which is, tragically, on the rise. This Bill is an important opportunity to make the internet a safe place for children. It sets out duties on platforms to prevent children from encountering illegal, harmful or pornographic content. That is all very welcome.

However, despite some of the Government’s ambitious claims, the Bill still falls short of fully protecting children. As the National Society for the Prevention of Cruelty to Children argues, the Government have failed to grasp the dynamics of online child abuse and grooming—[Interruption.] Again, I am being heckled from the Front Bench, but if Ministers engage with the children’s charities they will find a different response. For example—[Interruption.] Yes, but they are not coming out in support of the Bill, are they? For example, it is well evidenced that abusers will often first interact with children on open sites and then move to more encrypted platforms. The Government should require platforms to collaborate to reduce harm to children, prevent abuse from being displaced and close loopholes that let abusers advertise to each other in plain sight.

The second issue is illegal activity. We can all agree that what is illegal offline should be illegal online, and all platforms will be required to remove illegal content such as terrorism, child sex abuse and a range of other serious offences. It is welcome that the Government have set out an expanded list, but they can and must go further. Fraud was the single biggest crime in the UK last year, yet the Business Secretary dismissed it as not affecting people’s everyday lives.

The approach to fraud in this Bill has been a bit like the hokey-cokey: the White Paper said it was out, then it was in, then it was out again in the draft Bill and finally it is in again, but not for the smaller sites or the search services. The Government should be using every opportunity to make it harder for scammers to exploit people online, backed up by tough laws and enforcement. What is more, the scope of this Bill still leaves out too many of the Law Commission’s recommendations on online crimes.

The third issue is disinformation. The war in Ukraine has unleashed Putin’s propaganda machine once again. That comes after the co-ordinated campaign by Russia to discredit the truth about the Sergei Skripal poisonings. Many other groups have watched and learned: from covid anti-vaxxers to climate change deniers, the internet is rife with dangerous disinformation. The Government have set up a number of units to tackle disinformation and claim to be working with social media companies to take it down. However, that is opaque and far from optimal. The only mention of disinformation in the Bill is that a committee should publish a report. That is far from enough.

Returning to my earlier point, it is the business models and systems of social media companies that create a powerful tool for disinformation and false propaganda to flourish. Being a covid vaccine sceptic is one thing, but being able to quickly share false evidence dressed up as science to millions of people within hours is a completely different thing. It is the power of the platform that facilitates that, and it is the business models that encourage it. This Bill hardly begins to tackle those societal and democratic harms.

The fourth issue is online abuse. From racism to incels, social media has become a hotbed for hate. I agree with the Secretary of State that that has poisoned public life. I welcome steps to tackle anonymous abuse. However, we still do not know what the Government will designate as legal but harmful, which makes it very difficult to assess whether the Bill goes far enough, or indeed too far. I worry that those definitions are left entirely to the Secretary of State to determine. A particularly prevalent and pernicious form of online hate is misogyny, but violence against women and girls is not mentioned at all in the Bill—a serious oversight.

The decision on which platforms will be regulated by the Bill is also arbitrary and flawed. Only the largest platforms will be required to tackle harmful content, yet smaller platforms, which can still have a significant, highly motivated, well-organised and particularly harmful user base, will not. Ofcom should regulate based on risk, not just on size.

The fifth issue is that the regulator and the public need the teeth to take on the big tech companies, with all the lawyers they can afford. It is a David and Goliath situation. The Bill gives Ofcom powers to investigate companies and fine them up to 10% of their turnover, and there are some measures to help individual users. However, if bosses in Silicon Valley are to sit up and take notice of this Bill, it must go further. It should include stronger criminal liability, protections for whistleblowers, a meaningful ombudsman for individuals, and a route to sue companies through the courts.

The final issue is future-proofing, which we have heard something about already. This Bill is a step forward in dealing with the likes of Twitter, Facebook and Instagram—although it must be said that many companies have already begun to get their house in order ahead of any legislation—but it will have taken nearly six years for the Bill to appear on the statute book.

Since the Bill was first announced, TikTok has emerged on the scene, and Facebook has renamed itself Meta. The metaverse is already posing dangers to children, with virtual reality chat rooms allowing them to mix freely with predatory adults. Social media platforms are also adapting their business models to avoid regulation; Twitter, for example, says that it will decentralise and outsource moderation. There is a real danger that when the Bill finally comes into effect, it will already be out of date. A duty of care approach, focused on outcomes rather than content, would create a much more dynamic system of regulation, able to adapt to new technologies and platforms.

In conclusion, social media companies are now so powerful and pervasive that regulating them is long overdue. Everyone agrees that the Bill should reduce harm to children and prevent illegal activity online, yet there are serious loopholes, as I have laid out. Most of all, the focus on individual content rather than business models, outcomes and algorithms will leave too many grey areas and black spots, and will not satisfy either side in the free speech debate.

Despite full prelegislative scrutiny, the Government have been disappointingly reluctant to accept those bigger recommendations. In fact, they are going further in the wrong direction. As the Bill progresses through the House, we will work closely with Ministers to improve and strengthen it, to ensure that it truly becomes a piece of world-leading legislation.

Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

We will begin with a time limit of five minutes, but that is likely to reduce.

20:16
Julian Knight (Solihull) (Con)

Some colleagues have been in touch with me to ask my view on one overriding matter relating to this Bill: does it impinge on our civil liberties and our freedom of speech? I say to colleagues that it does neither, and I will explain how I have come to that conclusion.

In the mid-1990s, when social media and the internet were in their infancy, the forerunners of the likes of Google scored a major win in the United States. Effectively, they got the US Congress to agree to the greatest “get out of jail free” card in history: namely, to agree that social media platforms are not publishers and are not responsible for the content they carry. That has led to a huge flowering of debate, knowledge sharing and connections between people, the likes of which humanity has never seen before. We should never lose sight of that in our drive to fairly regulate this space. However, those platforms have also been used to cause great harm in our society, and because of their “get out of jail free” card, the platforms have not been accountable to society for the wrongs that are committed through them.

That is quite simplistic. I emphasise that as time has gone by, social media platforms have to some degree recognised that they have responsibilities, and that the content they carry is not without impact on society—the very society that they make their profits from, and that nurtured them into existence. Content moderation has sprung up, but it has been a slow process. It is only a few years ago that Google, a company whose turnover is higher than the entire economy of the Netherlands, was spending more on free staff lunches than on content moderation.

Content moderation is decided by algorithms, based on terms and conditions drawn up by the social media companies without any real public input. That is an inadequate state of affairs. Furthermore, where platforms have decided to act, there has been little accountability, and there can be unnecessary takedowns, as well as harmful content being carried. Is that democratic? Is it transparent? Is it right?

These masters of the online universe have a huge amount of power—more than any industrialist in our history—without facing any form of public scrutiny, legal framework or, in the case of unwarranted takedowns, appeal. I am pleased that the Government have listened in part to the recommendations published by the Digital, Culture, Media and Sport Committee, in particular on Parliament’s being given control through secondary legislation over legal but harmful content and its definition—an important safeguard for this legislation. However, the Committee and I still have queries about some of the Bill’s content. Specifically, we are concerned about the risks of cross-platform grooming and breadcrumbing—perpetrators using seemingly innocuous content to trap a child into a sequence of abuse. We also think that it is a mistake to focus on category 1 platforms, rather than extending the provisions to other platforms such as Telegram, which is a major carrier of disinformation. We need to recalibrate to a more risk-based approach, rather than just going by the numbers. These concerns are shared by charities such as the National Society for the Prevention of Cruelty to Children, as the hon. Member for Manchester Central (Lucy Powell) said.

On a systemic level, consideration should be given to allowing organisations such as the Internet Watch Foundation to identify where companies are failing to meet their duty of care, in order to prevent Ofcom from being influenced and captured by the heavy lobbying of the tech industry. There has been reference to the lawyers that the tech industry will deploy. If we look at any newspaper or LinkedIn, we see that right now, companies are recruiting, at speed, individuals who can potentially outgun regulation. It would therefore be sensible to bring in outside elements to provide scrutiny, and to review matters as we go forward.

On the culture of Ofcom, there needs to be greater flexibility. Simply reacting to a large number of complaints will not suffice. There needs to be direction and purpose, particularly with regard to the protection of children. We should allow for some forms of user advocacy at a systemic level, and potentially at an individual level, where there is extreme online harm.

On holding the tech companies to account, I welcome the sanctions regime and having named individuals at companies who are responsible. However, this Bill gives us an opportunity to bring about real culture change, as has happened in financial services over the past two decades. During Committee, the Government should actively consider the suggestion put forward by my Committee—namely, the introduction of compliance officers to drive safety by design in these companies.

Finally, I have concerns about the definition of “news publishers”. We do not want Ofcom to be effectively a regulator or a licensing body for the free press. However, I do not want in any way to do down this important and improved Bill. I will support it. It is essential. We must have this regulation in place.

John Nicolson (Ochil and South Perthshire) (SNP)

Thank you, Madam Deputy Speaker, but I was under the impression that I was to wind up for my party, rather than speaking at this juncture.

Madam Deputy Speaker

If the hon. Gentleman would prefer to save his slot until later—

John Nicolson

I would, Madam Deputy Speaker, if that is all right with you.

Madam Deputy Speaker

Then we shall come to that arrangement. I call Dame Margaret Hodge.

20:22
Dame Margaret Hodge (Barking) (Lab)

Thank you, Madam Deputy Speaker. I hope that I will take only three minutes.

The human cost of abuse on the internet is unquantifiable—from self-harm to suicide, grooming to child abuse, and racism to misogyny. A space we thought gave the unheard a legitimate voice has become a space where too many feel forced to stay offline. As a Jewish female politician online, I have seen my identities perversely tied together to discredit my character and therefore silence my voice. I am regularly accused of being a “Zionist hag”, a “paedophile” and a “Nazi”. But this is not just about politicians. We all remember the tsunami of racism following the Euros, and we know women are targeted more online than men. Social media firms will not tackle this because their business model encourages harmful content. Nasty content attracts more traffic; more traffic brings more advertising revenue; and more revenue means bigger profits. Legislation is necessary to make the social media firms act. However, this Bill will simply gather dust if Ofcom and the police remain underfunded. The “polluter pays” principle—that is, securing funding through a levy on the platforms—would be much fairer than taxpayers picking up the bill for corporate failures.

I cherish anonymity for whistleblowers and domestic violence victims—it is vital—but when it is used as a cloak to harm others, it should be challenged. The Government’s halfway measure allows users to choose to block anonymous posts by verifying their own identity. That ignores police advice not to block abusive accounts, as those accounts help to identify genuine threats to individuals, and it ignores the danger of giving platforms the power to verify identities. We should think about the Cambridge Analytica scandal. Surely a third party with experience in unique identification should carry out checks on users. Then we all remain anonymous to platforms, but can be traced by law enforcement if found guilty of harmful abuse. We can then name and shame offenders.

On director liability, fines against platforms become a business cost and will not change behaviour, so personal liability is a powerful deterrent. However, enforcing this liability only when a platform fails to supply information to Ofcom is feeble. Directors must be made liable for breaching safety duties.

Finally, as others have said, most regulations apply only to category 1 platforms. Search engines fall through the cracks; BitChute, Gab, 4chan—all escape, but as we saw in the attacks on Pittsburgh’s synagogue and Christchurch’s mosque, all these platforms helped to foster those events. Regulation must be based on risk, not size. Safety should be embedded in any innovative products, so concern about over-regulating innovation is misplaced. This is the beginning of a generational change. I am grateful to Ministers, because I do think they have listened. If they continue to listen, we can make Britain the safest place online.

20:26
Mr John Whittingdale (Maldon) (Con)

This Bill is a groundbreaking piece of legislation, and we are one of the first countries to attempt to bring in controls over content online. I therefore share the view of the hon. Member for Manchester Central (Lucy Powell) that it is a great pity that its Second Reading was scheduled for a day when there is so much other business.

The Bill has been a long time in the preparation. I can remember chairing an inquiry of the Culture, Media and Sport Committee in 2008 on the subject of harmful content online. Since then, we have had a Green Paper, a White Paper, a consultation, a draft Bill, a Joint Committee, and several more Select Committee inquiries. It is important that we get this right, and the Bill has grown steadily, as the Secretary of State outlined. I do not need to add to the reasons why it is important that we control content and protect vulnerable people from online content that is harmful to them.

There are two areas where I want to express a word of caution. First, as the Under-Secretary, my hon. Friend the Member for Croydon South (Chris Philp), is very much aware, the Government have an ambition to make the United Kingdom the tech capital of the world. We have been incredibly successful in attracting investment. He will know better than I that the tech industry in Britain is now worth over $1 trillion, and that we have over 100 unicorns, but the Bill creates uncertainty, mainly because so much is subject to secondary legislation and not spelled out in detail in the Bill. This will stifle innovation and growth.

It is fairly obvious which are the main companies that will fall into the category 1 definition. We are told that there may be some 15 to 20. Some of them are certainly obvious. However, I share the view that this needs to be determined more by risk than by reach. A company does not necessarily pose a significant risk simply because it is large. Companies such as Tripadvisor, eBay and Airbnb, which, on the size criteria, might fall within scope of category 1, should not do so. I hope that the Secretary of State and the Minister can say more about the precise definitions that will determine categories. This is more serious for the category 2 companies; it is estimated that some 25,000 may fall within scope. It is not clear precisely what the obligations on them will be, and that too is causing a degree of uncertainty. It is also unclear whether some parts of a large company with several businesses, such as Amazon, would be in category 1 or category 2, or what would happen if companies grow. Could they, for instance, be re-categorised from category 2 to category 1? These concerns are being raised by the tech industry, and I hope that my hon. Friend the Minister will continue to talk to techUK, to allay those fears.

The second issue, as has been rightly identified, is the effect on freedom of speech. As has been described, tech platforms already exercise censorship. At the moment, they exercise their own judgment as to what is permissible and what is not, and we have had examples such as YouTube taking down the talkRADIO channel. I spent a great deal of time talking to the press and media about the special protections that journalism needs, and I welcome the progress that has been made in the Bill. It is excellent that journalistic content will be put in a special category. I repeat the question asked by my right hon. Friend the Member for Ashford (Damian Green). The Secretary of State made some very welcome comments on, I think, “This Morning” about the introduction of an additional protection so that, if a journalist’s shared content were removed from an online platform, they would need to be informed and able to appeal. That may require additional amendments to the Bill, so perhaps the Minister could say when we are likely to see those.

There is also the concern raised by the periodical publishers that specialist magazines appear to be outside the protection of journalistic content. I hope that that can be addressed, because there are publications that deserve the same level of protection.

There is a wider concern about freedom of speech. The definition “legal but harmful” raises real concerns, particularly given that it is left open to subsequent secondary legislation to set out exactly what the categories will be. There are also widespread concerns that we need to avoid, at all costs, setting a precedent that may be used by others who are more keen to censor discussion online. In particular, clause 103(2)(b) relates to messaging services and allows Ofcom to require the use of accredited technology to identify child sexual exploitation and abuse (CSEA) material. The Minister will be aware that that matter is also causing concern.

20:31
Darren Jones (Bristol North West) (Lab)

In the interest of time, I will just pose a number of questions, which I hope the Minister might address in summing up. The first is about the scope of the Bill. The Joint Committee of which I was a member recommended that the age-appropriate design code, which is very effectively used by the Information Commissioner, be used as a benchmark in the Bill, so that any services accessed or likely to be accessed by children are regulated for safety. I do not understand why the Government rejected that suggestion, and I would be pleased to hear from the Minister why they did so.

Secondly, the Bill delegates lots of detail to statutory instruments, codes of practice from the regulator, or later decisions by the Secretary of State. Parliament must see that detail before the Bill becomes an Act. Will the Minister commit to those delegated decisions being published before the Bill becomes an Act? Could he explain why the codes of practice are not being set as mandatory? I do not understand why codes of practice, much of the detail of which the regulator is being asked to set, will not be made mandatory for businesses. How can minimum standards for age or identity verification be imposed if those codes of practice are not made mandatory? Perhaps the Minister could explain.

Many users across the country will want to ensure that their complaints are dealt with effectively. We recommended an ombudsman service to deal with complaints that had been exhausted through the regulated companies’ own complaints systems, but the Government rejected it. Please could the Minister explain why?

I was pleased that the Government accepted the concept of the ability for a super-complaint to be brought on behalf of groups of users, but the decision as to who will be able to bring a super-complaint has been deferred, subject to a decision by the Secretary of State. Why, and when will that decision be taken? If the Minister could allude to who they might be, I am sure that would be welcome.

Lastly, there are a number of exemptions and areas of unfinished work that leave significant holes in the legislation. There is much more work to be done on clauses 5, 6 and 50—on democratic importance, on journalism and the definition of journalism, on the exemptions for news publishers, and on disinformation, which is mentioned only once in the entire Bill. I and many others recognise that these are not easy issues, but they should be considered fully before legislation is proposed that leaves gaping holes for people who want to get around it, and for those who wish to test the parameters of this law in the courts, probably for many years. All of us, on a cross-party basis in this House, support the Government’s endeavours to make it safe for children and others to be online. We want the legislation to be implemented as quickly as possible and to be as effective as possible, but there are significant concerns that it will be jammed up in the judicial system, with this House unacceptably giving judges the job of fleshing out what many of the important exemptions will mean in practice.

The idea that the Secretary of State should have the power to intervene with the independent regulator and tell it what it should or should not do obviously undermines its independence. While Ministers might give assurances to this House that the power will not be abused, I believe that other countries, whether China, Russia, Turkey or anywhere else, will say, “Look at Great Britain. It thinks this is an appropriate thing to do. We’re going to follow the golden precedent set by the UK in legislating on these issues and give our Ministers the ability to decide what online content should be taken down.” That seems a dangerous precedent.

Darren Jones

The Minister is shaking his head, but I can tell him that the legislation does do that, because we looked at this and took evidence on it. The Secretary of State would be able to tell the regulator that content should be “legal but harmful” and therefore should be removed as part of its systems design online. We also heard that the ability to do that at speed is very restricted, so the power would be ineffective in any case. The Government should therefore change their position on that. I do not understand why, in the face of evidence from pretty much every stakeholder, the Government think that that is an appropriate use of power, or why Parliament would vote it through.

I look forward to the Minister giving his answers to those questions, in the hope that, as the Bill proceeds through the House, it can be tidied up and made tighter and more effective, to protect children and adults online in this country.

20:35
Damian Collins (Folkestone and Hythe) (Con)

This is an incredibly important Bill. It has huge cross-party support and was subject to scrutiny by the Joint Committee, which produced a unanimous report, which shows the widespread feeling in both Houses and on both sides of this Chamber that we should legislate. I do feel, though, that I should respond to some of the remarks of the shadow Secretary of State, the hon. Member for Manchester Central (Lucy Powell), on the Joint Committee report.

I agree with the hon. Member that, unless this legislation covers the systems of social media companies as well as the content hosted, it will not be effective, but it is my belief that it does that. Throughout the evidence that the Committee took, including from Ofcom and not just the Government, it was stated to us very clearly that the systems of social media companies are within scope and that, in preparing the risk registers for the companies, Ofcom can look at risks. For Facebook, that could include the fact that the news feed recommends content to users, while for someone on TikTok using For You, it could be the fact that the company is selecting—algorithmically ranking—content that someone might like. That could include, for a teenage girl, content that promoted self-harm that was being actively recommended by the company’s systems, or, as Frances Haugen set out, extremist content and hate speech being actively promoted and recommended by the systems.

That would be in scope. The algorithms are within scope, and part of Parliament’s job will be to ensure on an ongoing basis that Ofcom is using its powers to audit the companies in that way, to gain access to information in that way, and to say that the active promotion of regulated content by a social media company is an offence. In passing this Bill, we expect that that will be fully in scope. If the legislation placed no obligation on a company to proactively identify any copies of content that it had judged should not be there and had taken down, we would have a very ineffective system. In effect, we would have what Facebook does to assess content today. If that were effective, we would not need this legislation, but it is woefully ineffective, so the algorithms and the systems are in scope. The Bill gives Ofcom the power to regulate on that basis, and we have to ensure that it does that in preparing the risk registers.

Following what my Joint Committee colleague, the hon. Member for Bristol North West (Darren Jones), said, the point about the codes of practice is really important. The regulator sets the codes of practice for companies to follow. The Government set out in their response to the Joint Committee report that the regulator can tell companies if their response is not adequate. If an area of risk has been identified where the company has to create policies to address that risk and the response is not good enough, the regulator can still find the company in breach. I would welcome it if the Minister wished to say more about that, either today or as the Bill goes through the House, because it is really important. The response of a company to a request from the regulator, having identified a risk on its platforms, cannot be: “Oh, sorry, we don’t have a policy on that.” It has to be able to set those policies. We have to go beyond just enforcing the terms of service that companies have created for themselves. Making sure they do what they say they are going to do is really important, as the Secretary of State said, but we should be able to push them to go further.

I agree, though, with the hon. Member for Manchester Central and other hon. Members about regulation being based on risk and not just size. In reality, Ofcom will have to make judgment calls on smaller sites that are posing a huge risk or a new risk that has been identified.

The regulator will have the power to regulate metaverse and virtual reality platforms. Anything that is a user-to-user service is already in scope of the legislation. The challenge for the regulator will be in moderating conversations between two people in a virtual room, which is much harder than when people are posting text-based content. The technology will have to adapt to do that, but we should start that journey on the basis that such services are already in scope.

Finally, on the much-used expression “legal but harmful”, I am pleased the Government took one of our big recommendations, which is to write more offences clearly into the Bill, so that it is clear what is actually being regulated—the promotion of self-harm and hate speech, for example, are regulated content. The job of the regulator then is to set the threshold at which intervention should come, and I think that should be based on case law. On many of these issues, such as the abuse of the England footballers after the final of the European championships, people have been sentenced in court for what they did. That creates good guidance and a good baseline for what hate speech is in that context and where we would expect intervention. It would be much easier for the Bill, for the service users that are regulated and for the people who post content to know what the offences are and where the regulatory standard sits. Rather than describing those things as “legal but harmful”, we should describe them as what they are: regulated offences based on existing offences in law.

The Government took an important step in their response by saying that amendments to the codes of practice that bring new offences within the scope of these priority areas of harm should go through an affirmative process in both Houses. That is really important. Ultimately, the regulation should be based on our laws, and changes should be based on decisions taken in this House.

Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. After the next speaker, the time limit will be reduced to four minutes.

20:40
Kirsty Blackman (Aberdeen North) (SNP)

Thank you, Madam Deputy Speaker.

I want to focus on how people actually use the internet, particularly how young people use it. I feel, as was suggested in one of the comments earlier, that this Bill and some of the discussion around it miss the point about the ways in which young people in particular actually use the internet.

We have not mentioned, or I have not heard anyone mention, Discord. I have not heard anyone mention Twitch. I have not heard people talking about how people interact on Fortnite. A significant number of young people use Fortnite to interact with their friends. That is the way they speak to their friends. I do not know if the Minister is aware of this, but the parental controls on Fortnite can only stop children speaking to everybody; they cannot stop them speaking to everybody except their friends. A lot of these sites have no parental controls that parents can adequately utilise; parents face only the heavy-handed choice of banning their child entirely from something or allowing them to do everything. I think some of this gets missed because the Bill does not reflect the way young people actually use the internet.

In the Girls’ Attitudes Survey produced by Girlguiding, 71% of the 2,000 girls surveyed said that they had experienced harmful content while online. One of the important things I also want to stress, though, is that a quarter of LGBQ and disabled girls found online forums and spaces an important source of support. We need to make sure that children and young people keep the ability to access those sources of support. Whether it is on forums, or on Fortnite, Minecraft, Animal Crossing or whatever platform they happen to be speaking to their friends on, it is important and key that young people can continue to communicate. That has been especially important during the pandemic.

There is at this moment a major parenting knowledge gap. There is a generation of parents who did not grow up using the internet. I was one of the first people to grow up using the internet and have kids; they are at the top end of primary school now. Once this generation of kids are adults, they will know how their children are behaving online and what the online world is like, because they will have lived through it themselves. The current generation of parents has not, and so it has this knowledge gap.

I am finding that a lot of my kids’ friends have rules that I consider totally—totally—unacceptable and inappropriate because they do not match how kids actually use the internet and the interactions they are likely to have on there. I asked my kids what they thought was the most important thing, and they said the ability to choose what they see and what they do not see, and who they hear from and who they do not hear from. That was the most important thing to them.

That has been talked about in a lot of the information we have received—the requirement to look at algorithms and to let users opt in to being served algorithmic content, rather than having to opt out, as we do with Facebook. Facebook says, “Are you sure you don’t want to see this content any more?” Well, yes, I have clicked that I do not want to see it—of course I do not want to see it any more. Of course I would like to see the things my hon. Friend the Member for Ochil and South Perthshire (John Nicolson) posts and all of the replies he sends to people—I want that to pop up with my notifications—but I should have to choose to do that.

Kids feel like that as well—my kids, and kids up and down the country—because, as has been talked about, once you get into these cycles of seeing inappropriate, harmful, damaging content, you are more likely to be served with more and more of that content. At the very first moment, people should be able to say, “Hang on, I don’t want to see any of this”, and when they sign up to a site they should immediately be able to say, “No, I don’t want to see any of this. All I want to do is speak to the people I know, or those I have sent a friend request to or accepted a friend request from.” We need to ensure that there are enough safeguards like that in place for children and young people and their parents to be able to make those choices in the knowledge and understanding of how these services will actually be used, rather than MPs who do not necessarily use these services making these decisions. We need to have that flexibility.

My final point is that the internet is moving and changing. Twenty years ago I was going to LAN parties and meeting people I knew from online games. That is still happening today, and we are only now getting the legislation and catching up. It has taken that long for us to get here, so this legislation must be fit for the future. It must be flexible enough to work with the new technologies, social media and gaming platforms that are coming through.

20:45
Andrew Percy (Brigg and Goole) (Con)

I, too, regret the short time we have to debate this important Bill this evening. This is much-needed legislation and I agree with many of the comments already made.

These platforms have been warned over the years to take action, yet they have failed to do so. Their online platforms have remained a safe space for racism, Holocaust denial, homophobia, conspiracy theories and general bullying. One of the best things I ever did for my mental health was to leave Twitter, but for many young people that is not an option, as it cuts them off from access to their friends and much of what is their society. So I am proud that the Government are taking action on this but, as the Minister knows from my meetings with him alongside the Antisemitism Policy Trust, there are ways in which I think the Bill can be improved.

First, on small, high-harm platforms, I pay tribute to the Antisemitism Policy Trust, which has been leading the charge. As the hon. Member for Aberdeen North (Kirsty Blackman) said, everybody knows Facebook, Twitter and YouTube but few people are aware of a lot of the smaller platforms such as BitChute, 8kun—previously 8chan—or Minds. These small platforms are a haven for white supremacists, incels, conspiracy theorists and antisemites; it is where they gather, converse and share and spew their hate.

An example of that is a post from the so-called anti-Jewish meme repository on the platform Gab, which showed a picture of goblins—in this instance the usual grotesque representation of those age-old Jewish physical stereotypes—alongside the phrase, “Are you ready to die in another Jewish war, Goyim?” That is the sort of stuff that is on these small platforms, and it is not rare; we see it all over. Indeed, many of these small platforms exist purely to spew such hate, but at present, despite the many measures in the Bill that I support, these sites will be sifted by Ofcom into two major categories based on their size and functionality. I met the Minister to discuss this point recently.

The Government have not so far been enthusiastic about risk being a determining factor, for fear that too many of the small platforms would be drawn into scope. That is why I hope that, as this Bill progresses, the Minister will consider a small amendment to give Ofcom the power to draw the small but high-harm platforms—based on its assessments, the so-called super-complaints that we have heard about, or other means—into category 1 status. That would add regulatory oversight and a burden on those platforms. This is all about putting pressure on them—requiring them to go through more hurdles to frustrate their business model of hate, and making it as uncomfortable as possible for them. I hope the Minister will look at that as the Bill progresses.

I am very short of time, but I also want to raise the issue of search, which the Minister knows I have raised previously. The all-party parliamentary group against antisemitism found examples on Alexa and other voice-activated search platforms where the responses that come back are deeply offensive and racist. I understand that the relationship a user enters into with a search engine is different from having an account with a particular social media platform, but these search engines are providing access to all sorts of grotesque racist and misogynistic content, and I hope we can look at that as the Bill progresses.

20:49
Luke Pollard (Plymouth, Sutton and Devonport) (Lab/Co-op)

I welcome the Bill. It is an important step forward, and it is because I welcome it that I want to see it strengthened. It is an opportunity for us to get this right and, in particular, to learn lessons from where we have got it wrong in the past. I want to raise two different types of culture: the first is incel culture, which I would like to relate to the experience we had in Keyham, with the mass shooting in Plymouth last year; the second is the consequences of being Instafamous.

It is just over six months since the tragic shooting in Keyham in which we lost five members of our community. The community feels incredibly strongly that we want to learn the lessons, no matter how painful or difficult they are, to ensure that something like this never happens again. We are making progress, working with the Home Office on gun law changes, in particular on linking medical records and gun certificates. One part is incredibly difficult, and that is addressing incel culture, which has been mentioned from the Front Bench by my hon. Friend the Member for Pontypridd (Alex Davies-Jones) and by the hon. Member for Brigg and Goole (Andrew Percy). It sits in the toxic underbelly of our internet and in many cases, it sits on those smaller platforms to which this Bill will not extend the full obligations. I mention that because it results in real-world experiences.

I cannot allocate responsibility for what happened in the Keyham shooting because the inquest is still under way and the police investigations are ongoing, but it is clear that online radicalisation contributed to it, and many of the sites that are referenced as smaller sites that will not be covered by the legislation contributed perhaps in part to the online radicalisation.

When incel culture leads to violence, it is not classed as domestic terrorism; it falls between the stools, and it must not fall between the stools of this legislation too. I would be grateful if the Minister agreed to meet me and members of the Keyham community to understand how his proposals relate to the lessons we are learning in Keyham, to make sure that nothing like this can ever happen again. With the online radicalisation of our young men in particular, it is really important that we understand where the rescue routes are. This is not just about the legislation; it needs to be about how we rescue people from the routes they are going down. I would like to understand from the Minister how we can ensure that there are rescue routes—that schools, social services and mental health providers understand how to rescue people from incel culture and its online radicalisation, as well as from US gun culture: the glorification of guns and the misogynistic culture that exists in this space.

The second point about culture is an important one about how we learn from young people. Plymouth is a brilliant place. It is home both to GOD TV—a global evangelical broadcaster—and to many porn production companies. It is quite an eclectic, creative setting. We need to look at how we can learn from the culture of being Instafamous, which many of our young people aspire to from an early age. They look at beautiful bodies and perfect smiles—an existence that is out of reach for many people. In many cases, they view the creation of online pornography via sites such as OnlyFans as a natural and logical extension of being Instafamous. It is something that, sadly, can attract a huge amount of income, so young people taking their kit off at an early age, especially in their teenage years, can produce high earnings. I want to see the big companies challenged not to serve links on Instagram profiles to OnlyFans content for under-18s. That sits in a grey area of the Bill. I would be grateful if the Minister looked at how we can address that seriously, so that we can challenge that culture and help build understanding that being Instafamous must mean consent and protection.

20:53
Adam Afriyie (Windsor) (Con)

Overall, I very much welcome the Bill. It has been a long time coming, but none of us here would disagree that we need to protect our children, certainly from pornography and all sorts of harassment and awful things that are on the internet and online communications platforms. There is no argument or pushback there at all. I welcome the age verification side of things. We all welcome that.

The repeal of the Malicious Communications Act 1988 is a good move. The adjustment of a couple of sections of the Communications Act 2003 is also a really good, positive step, and I am glad that the Bill is before us now. I think pretty much everyone here would agree with the principles of the Bill, and I thank the Government for getting there eventually and introducing it. However, as chair of the freedom of speech all-party parliamentary group I need to say a few words and express a few concerns about some of the detail and some of the areas where the Bill could perhaps be improved still further.

The first point relates to the requirement that social media companies have regard to freedom of speech. It is very easy, with all the concerns we have—I have them too—to push too hard and say that social media companies should clamp down immediately on anything that could be even slightly harmful, even if it is uncertain what “harmful” actually means. We must not give them the powers, or the incentive through financial penalties, to shut down freedom of speech just in case something is seen as harmful by somebody. As the Bill progresses, therefore, it would be worth looking at whether there are areas where we can tighten up the rights and powers on freedom of speech.

Secondly, there is the huge issue—one or two other Members have raised it—of definitions. Clearly, if we say that something that is illegal should not be there and should disappear, of course we would all agree with that. If we say that something that is harmful should not be there, should not be transmitted and should not be amplified, we start to get into difficult territory, because what is harmful for one person may not be harmful for another. So, again, we need to take a little more of a look at what we are talking about there. I am often called “Tory scum” online. I am thick-skinned; I can handle it. It sometimes happens in the Chamber here—[Laughter.]—but I am thick-skinned and I can handle it. So, what if there was an option online for me to say, “You know what? I am relaxed about seeing some content that might be a bit distasteful for others. I am okay seeing it and hearing it.”? In academic discourse in particular, it is really important to hear the other side of the argument, the other side of a discussion, the other side of a debate. Out of context, one phrase or argument might be seen to be really harmful to a certain group within society. I will just flag the trans debate. Even the mention of the word trans or the words male and female can really ignite, hurt and harm. We could even argue that it is severe harm. Therefore, we need to be very careful about the definitions we are working towards.

Finally, the key principle is that we should ensure that adults who have agency can make decisions for themselves. I hope social media companies can choose not to remove content entirely or amplify content, but to flag content so that grown-ups with agency like us, like a lot of the population, can choose to opt in or to opt out.

20:57
Carla Lockhart (Upper Bann) (DUP)

While it is long overdue, I welcome the Bill, and I welcome the fact that it goes some way to addressing some of the concerns previously raised in this House. I thank the Minister for his engagement and the manner in which the Government have listened, particularly on the issue of anonymity. While the Bill is not perfect, we will continue to press for the removal of the cloak of anonymity, which allows faceless trolls to abuse and cause harm.

In building the Bill, a logical cornerstone would be that what is illegal offline—on the street, in the workplace and in the schoolyard—is also illegal online. The level of abuse I have received at times on social media would certainly be a matter for the police if it happened in person. It is wrong that people can get away with it online. However, there are dangers to our right to free speech around regulating content that is legal but deemed harmful to adults. The Bill allows what is legal but harmful to adults to be decided by the Secretary of State. Whatever is included in that category now could be easily expanded in future by regulations, which we all know means limited parliamentary scrutiny. As responsible legislators, we must reflect on how that power could be misused in the future. It could be a tool for repressive censorship and that is surely something neither the Government nor this House would wish to see in a land where freedom of speech is such a fundamental part of what and who we are. Without robust free speech protections, all the weight of the duties on content that is legal but harmful to adults will be pushing in one direction, and sadly, that is censorship. I urge the Government to address that in the Bill.

We also need to look at the weakness of the Bill in relation to the protection, particularly for children and young people, from pornography. It is welcome that since the publication of the draft Bill, the Government have listened to concerns by introducing part 5. In eight days, it will be the fifth anniversary of the Digital Economy Act 2017 receiving Royal Assent. This Government took the decision not to implement part 3 of that Act. Those of us in the House who support age verification restrictions being placed on pornographic content are justifiably hesitant, wondering whether the Government will let children down again.

It could be 2025 before children are protected through age verification. Even if the Bill becomes law, there is still no certainty that the Government will commence the provisions. It simply cannot be left to the Secretary of State in 2025 to move secondary legislation to give effect to age verification. A commencement clause needs to be placed in the Bill. Children deserve the right to know that this Government will act for them this time.

Furthermore, the Bill needs to be consistent in how it deals with pornography across parts 3 and 5. Age verification is a simple concept. If a website, part of a website or social media platform hosts or provides pornographic content, a person’s age should be verified before access. If a child went into a newsagents to attempt to buy a pornographic magazine, they would be challenged by the shopkeeper. This goes back to the cornerstone of this issue: illegal offline should mean illegal online. The concept may be simple but the Bill, as drafted, adds unnecessary complexities. I ask the Minister to act and make parts 3 and 5 similar. We should also give Ofcom more power when it is implementing the Bill.

21:01
Dean Russell (Watford) (Con)

I had the great privilege of sitting on the Joint Committee on the draft Bill before Christmas and working with the Chair, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), fantastic Members from across both Houses and amazing witnesses.

We heard repeated stories of platforms profiting from pain and prejudice. One story that really affected me was that of Zach Eagling, a heroic young boy who has cerebral palsy and epilepsy and who was targeted with flashing images by cruel trolls to trigger seizures. Seizures have been triggered in the same way in other people with epilepsy, affecting their lives and risking not just harm but, depending on their situation, potentially death. That is why I and my hon. Friend the Member for Stourbridge (Suzanne Webb)—and all members of the Joint Committee, actually, because this was in our report—backed Zach’s law.

Kim Leadbeater (Batley and Spen) (Lab)

Ten-year-old Zach is a child in my constituency who has, as the hon. Member said, cerebral palsy and epilepsy, and he has been subjected to horrendous online abuse. I hope that the Minister can provide clarity tonight and confirm that Zach’s law—which shows that not just psychological harm and distress, but physical harm can be created as a result of online abuse and trolling—will be covered in the Bill.

Dean Russell

My understanding—hopefully this will be confirmed from the Dispatch Box—is that Zach’s law will be covered by clause 150 in part 10, on communications offences, but I urge the Ministry of Justice to firm that up further.

One thing that really came through for me was the role of algorithms. The only real-world analogy I can find for the danger of algorithms is narcotics. These are organisations that targeted harmful content at people to get them more and more addicted to it. By doing that, they numbed the senses of people using technology and social media, so that they engaged in practices that did them harm, turning them against not only others, but themselves. We heard awful stories about such things as barcoding—young girls cutting themselves—which was the most vile thing to hear, especially as a parent myself. There was also the idea that it was okay to be abusive to other people, and the fact that it became normalised to hurt oneself, including in ways that cannot be undone.

That leads on to a point about numbing the senses. I am really pleased that in debating the Bill today we have talked about the metaverse, because the metaverse is not just some random technology that we might talk about; it is about numbing the senses. It is about people putting on virtual reality headsets and living in a world that is not reality, even if only for minutes or hours. As we look at these technologies and at virtual reality, my concern is that children and young people will be encouraged to spend more time in worlds that are not real and that could include more harmful content. Such worlds are increasingly lifelike, in the impact they can have and in their capability for user-to-user engagement.

I therefore think that although at the moment the Bill includes Meta and the metaverse, we need to look at it almost as a tech platform in its own right. We will not get everything right at first; I fully support the Bill as it stands, but as we move forward we will need to continue to improve it, test it and adapt it as new technologies come out. That is why I very much support the idea of a continuing Joint Committee specifically on online safety, so that as time goes by the issues can be scrutinised and we can look at whether Ofcom is delivering in its role. Ultimately, we need to use the Bill as a starting point to prevent harm now and for decades to come.

21:05
Liz Twist (Blaydon) (Lab)

I welcome the Bill, which is necessary and overdue, but I would like to raise two issues: how the Bill can support the prevention of suicide and self-harm, and young people’s mental health in relation to body image.

First, all suicide and self-harm content should be addressed across all platforms, regardless of size: it is not just the larger platforms that should be considered. The requirement imposed on category 1 platforms relating to legal but harmful suicide and self-harm content should be extended to all platforms, as many colleagues have said. There is a real concern that users will turn from the larger to the smaller platforms, so the issue needs to be addressed. Will the Minister confirm that even smaller platforms will be asked at the start to do an assessment of the risk they pose?

Secondly, the Secretary of State referred to secondary legislation, which will be necessary to identify legal but harmful suicide and self-harm content as a real priority for action. It would be really helpful if we could see that before the legislation is finally passed: it is a key issue and must be an urgent area of work.

Thirdly, I wonder whether the Government will look again at the Law Commission’s proposal that a new offence of encouraging or assisting serious self-harm be created, and that the Bill should make assisting self-harm a priority issue with respect to illegal content. Will the Minister look again at that proposal as the Bill progresses?

I also want to speak about damage to body image, particularly in relation to young people. All of us want to look our best on social media. Young people in particular face a real barrage of digitally enhanced and in many cases unrealistic images that can have a negative effect on body image. Research by the Mental Health Foundation shows that material that damages body image can have a serious negative effect on young people’s mental health. As other hon. Members have said, and as most of us know from our own experience, many of the images that we see on social media are driven by algorithms that can amplify the harm to young people. That is particularly concerning given its association with the possible development of eating disorders and other mental health conditions.

The Bill does include some provision on algorithms, but more needs to be done to protect our young people from that damage. I encourage the Government to consider amendments that would give users more control over algorithmically recommended content and ensure that the safest settings are the default. Users should be given more control over the kind of advertising they see and receive, to avoid excessive advertising showing perfect bodies. The Government should commit themselves to recognising material that damages body image as a serious form of harm.

There are many more detailed issues that I would have liked to raise tonight, but let me end by saying that we need to give serious consideration to ways of reducing the incidence of suicides and self-harm.

Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. I am reluctant to reduce the time limit, but I am receiving appeals for me to try to get more people in, so I will reduce it to three minutes. However, not everyone will have a chance to speak this evening.

21:09
Dame Caroline Dinenage (Gosport) (Con)

I congratulate the ministerial team and the army of fantastic officials who have brought this enormous and groundbreaking Bill to its current stage. It is one of the most important pieces of legislation that we will be dealing with. No country has attempted to regulate the internet so comprehensively as we have, and I welcome all the improvements that have been made to bring the Bill to this point. Those people have been extremely brave, and they have listened. There are widely competing interests at stake here, and the navigation of the Bill to a position where it has already achieved a degree of consensus is quite remarkable.

The pressure is on now, not least because we have all got into the habit of describing the Bill as the cavalry coming over the hill to solve all the ills of the online world. It is worth acknowledging from the outset that it will not be a silver bullet or a panacea for all the challenges that we face online. The point is, however, that it needs to be the best possible starting point—the groundwork from which to face down both the current threats and, more importantly, the likely challenges of the future. We all have a huge responsibility to work collaboratively, and not to let this process be derailed by side issues or clouded by party politics. Never has the phrase “not letting the perfect be the enemy of the good” been more appropriate. So much will be at risk if we do not seize the opportunity to make progress.

As the Secretary of State pointed out, the irony is that this vast and complex legislation is completely unnecessary. Search engines and social media platforms already have the ability to reduce the risks of the online world if they want to, and we have seen examples of that. However, while the bottom line remains their priority—while these precious algorithms remain so protected—the harms that are caused will never be tackled. With that in mind, I am more convinced than ever of the need for platforms to be held to account and for Ofcom to be given the powers to ensure that they are.

Inevitably, we will need to spend the next few weeks and months debating the various facets of this issue, but today I want to underline the bigger picture. It has always been an overarching theme that protecting children must be a top priority. One of the toughest meetings that I had as Digital Minister was with Ian Russell, whose 14-year-old daughter Molly took her own life after reading material promoting suicide and self-harm on Instagram. That is a conversation that brings a chill to the heart of any parent. Children are so often the victims of online harms. During lockdown, 47% of children said they had seen content that they wished they had not seen. Over a month-long period, the Internet Watch Foundation blocked at least 8.8 million attempts by UK internet users to access videos and images of children suffering sexual abuse.

There is so much at stake here, and we need to work together to ensure that the Bill is the very best that it can possibly be.

21:13
Jamie Stone (Caithness, Sutherland and Easter Ross) (LD)

Obviously I, and my party, support the thrust of the Bill. The Government have been talking about this since 2018, so clearly time is of the essence.

Members have referred repeatedly to the slight vagueness of the definitions currently in the Bill—words such as “harms”, for instance—so I wanted to examine this from a “first principles” point of view. In another place, and almost in another life, and for four long years—perhaps as a punishment brief—I was made the Chairman of the Subordinate Legislation Committee in the Scottish Parliament, so without bragging terribly much, I can say that there is nothing I do not know about affirmative and negative resolutions and everything to do with statutory instruments. You could call me a statutory instrument wonk. What I do know, and I do not think it is very different here, is that instruments come and go; they are not on the face of a Bill, because they are secondary legislation; and, by and large, ordinary, run-of-the-mill Members of Parliament do not take a huge amount of interest in them. The fact is, however, that the powers that will be granted to the Secretary of State to deliver definitions by means of subordinate legislation—statutory instruments—concern me slightly.

Reference has been made to how unfortunate it would be if the Secretary of State could tell the regulator what the regulator was or was not to do, and to the fact that other countries will look at what we do and, hopefully, see it as an example of how things should be done on a worldwide basis. Rightly or wrongly, we give ourselves the name of the mother of Parliaments. The concept of freedom of speech is incredibly important to the way we do things in this place and as a country. When it comes to the definition of what is bad, what is good, what should be online and what should not, I would feel happier if I could see that all 650 Members of Parliament actually understood and owned those definitions, because that is fundamental to the concept of freedom of speech. I look forward to seeing what comes back, and I have no reason to think that the Government are unsympathetic to the points that I am making. This is about getting the balance right.

Finally, in the short time available, I want to make two last points. My party is very keen on end-to-end encryption, and I need reassurance that it remains a possibility. Secondly, on the rules governing what is right and what is wrong for the press: the seven criteria would, as I read them, still allow the Russian propaganda channel Russia Today—a channel I am not keen on—to broadcast, and would allow my former colleague, the former First Minister of Scotland—this is no reflection on the Scottish National party—to broadcast his nonsense. Russia Today has now been banned, but the rules, as I see them, would still allow it to broadcast.

21:15
Saqib Bhatti (Meriden) (Con)

I am a great believer in the good that social media has done over the last few decades. It has transformed the way we interact, share ideas and stay connected. Social media has allowed a global conversation about global challenges such as climate change, poverty and even the conflict that we are witnessing in Ukraine. However, there is a dark side to social media, and I would be surprised if there were any Member of this House who had not experienced some form of it. The online world has become like the wild west: anything goes. Indeed, it was just last year when the whole country was gripped by the success of our football team in the Euros, and as I sadly watched us lose another penalty shoot-out, I turned to my wife and said, “You know what’s going to happen now, don’t you?” And it did. The three players who missed penalties, all young black men, were subjected to disgusting racist abuse. Monkey emojis were used to taunt them, and were not taken down because the Instagram algorithm did not deem that to be racism. Abuse on Twitter was rife, and the scale of it was so large that it restarted a national conversation, which I am sad to say we have had many times before.

On the back of that, I, along with 50 of my colleagues, wrote to the major social media companies: Reddit, Facebook, Twitter, Snapchat and TikTok. We asked for three things: that all accounts be verified; that the algorithm be adjusted with human interaction to account for differences in languages; and that there be a “three strikes and you’re out” policy for serial offenders, so that they knew that they would not be allowed to get away with abuse. Unfortunately, not all the companies responded, which shows how much respect they have for our democratic processes and for the moral duty to do the right thing. Those that did respond took long enough to do so, and took the view that they were already doing enough. Clearly, anyone can go on social media today and see that that is not true. It is not that the companies are burying their head in the sand; it is just not very profitable for them to make a change. If they had the will to do so, they certainly have the skill, innovative ability and resources to make it happen.

I fully accept that, in this legislation, the Government have taken a different approach, and there are clearly different ways to skin this cat. Fines of up to 10% of turnover, clarity on what is allowed in companies’ terms and conditions, and effective enforcement may well draw a clear line in the sand. I call on the social media companies to heed the message sent by 50 of my colleagues, and once again to recognise their moral duty to be positive and good players in society. We have an opportunity today to set a standard, so that when an aspiring young boy or girl wants to be in the public eye, whether as an athlete, a media star or a politician, they will no longer think that being abused online is an inevitable consequence of that choice.

21:18
Mrs Sharon Hodgson (Washington and Sunderland West) (Lab)

I speak in this debate as chair of the all-party parliamentary group on ticket abuse, which I set up over 10 years ago. The APPG shines a light on ticket abuse and campaigns to protect fans who are purchasing event tickets from being scammed and ripped off, often by the large-scale ticket touts that dominate resale sites such as Viagogo and StubHub. The APPG works with experts in the field such as FanFair Alliance, a music industry campaign, and the Iridium Consultancy to tackle industrial-scale ticket touting. I hope that when this legislation is reviewed in Committee, those organisations will be called on to share their expertise in this area.

Sadly, online ticket fraud is absolutely rife. Despite some regulatory and legislative improvements, not least in the Consumer Rights Act 2015, too many fans are still being scammed on a regular basis. The Bill, as it stands, includes a major loophole that means people will not be properly protected from online fraud. Search engines such as Google are not currently covered by the requirements on fraudulent advertising. A key issue in the ticketing market is how websites that allow fraudulent tickets to be sold often take out paid ads with Google that appear at the top of the search results. This gives the false impression to consumers that these sites are official ticket outlets. People mistakenly believe that only authorised ticket outlets can advertise on Google—people trust Google—and they are scammed as a result.

The Times reported last year that Google was taking advertising money from scam websites selling Premier League football tickets, even though the matches were taking place behind closed doors during lockdown—you couldn’t make it up. The Online Safety Bill needs to ensure that consumers are provided with much greater protection and that Google is forced to take greater responsibility for who it allows to advertise. If the Bill took action, online ticket fraud would be drastically reduced. With £2.3 billion lost to online fraud in the UK last year, it is very much needed.

It is also important to remember the human side of online fraud. Victims go through intense stress, as they are not only scammed out of their money but feel duped, stupid and humiliated. There cannot be a Member of this House who has not had to support a constituent devastated by online fraud. I have come across many stories, including one of an elderly couple who bought two tickets to see their favourite artist to celebrate their 70th wedding anniversary. When they arrived at the venue, they were turned away and told that they had been sold fake tickets. I have a lot more to say, Madam Deputy Speaker, but I think you get the drift.

21:21
Mrs Maria Miller (Basingstoke) (Con)

For too long, the tech giants have been able to dismiss the harms they create for the people we represent because they do not take seriously their responsibility for how their products are designed and used, which is why this legislation is vital.

The Bill will start to change the destructive culture in the tech industry. We live simultaneously in online and offline worlds, and we expect the rules and the culture to be the same in both, but at the moment, they are not. When I visited the big tech companies in Silicon Valley as Secretary of State in 2014 to talk about online moderation, which was almost completely absent at that stage, and child abuse images, which were not regularly removed, I rapidly concluded that the only way to solve the problem and the cultural deficit I encountered would be to regulate. I think this Bill has its roots in those meetings, so I welcome it and the Government’s approach.

I am pleased to see that measures on many of the issues on which I have been campaigning in the years since 2014 have come to fruition in this Bill, but there is still room for improvement. I welcome the criminalisation of cyber-flashing, and I pay tribute to Grazia, Clare McGlynn and Bumble for all their work with me and many colleagues in this place.

Wera Hobhouse

Scotland banned cyber-flashing in 2010, but that ban includes a motivation test, rather than just a consent test, so a staggering 95% of cyber-flashing goes unpunished. Does the right hon. Lady agree that we should not make the same mistake?

Mrs Miller

I will come on to that shortly, and the hon. Lady knows I agree with her. This is something the Government need to take seriously.

The second thing I support in this Bill is limiting anonymous online abuse. Again, I pay tribute to the Football Association, with which I have worked closely, Glitch, the Centenary Action Group, Compassion in Politics, Hope not Hate and Kick It Out. They have all done a tremendous job, working with many of us in this place, to get to this point.

Finally, I support preventing children from accessing pornography, although I echo what we heard earlier about it being three years too late. It is shameful that this measure was not enacted earlier.

The Minister knows that three demands are coming his way from me. We need to future-proof our approach to the law in this area. Tech moves quickly—quicker than the Government’s approach to legislation, which leaves us playing whack-a-mole. The devious methods of causing harm change rapidly, as do the motivations of perpetrators, to answer the point raised by the hon. Member for Bath (Wera Hobhouse). What stays the same is the lack of consent from victims, so will the Government please look at that as a way of future-proofing our law? A worrying example is deepfake technology that creates pornographic images of women. That is currently totally lawful. Nudification software is commercially available and uses images—only of women—to create nude images. I have already stated publicly that it should be banned. It has been in South Korea and Taiwan, yet our law is playing catch-up.

The second issue that the Government need to address is the fact that they are creating many more victims as a result of this Bill. We need to make sure that victim support is in place to augment the amazing work of organisations such as the Revenge Porn Helpline. Finally, to echo the point made by my hon. Friend the Member for Watford (Dean Russell), let me say that this is a complex area, as we are proving with every speech in this debate. I pay tribute to the Select Committee Chair, who is no longer in his place, and the Joint Committee Chair, but I believe that we need a joint standing committee to scrutinise the implementation of this Bill when it is enacted. This is a world-class piece of legislation to change culture, but we also need other countries to adopt a similar approach. A global approach is needed if this is to work to end the wild west.

21:25
Gavin Robinson (Belfast East) (DUP)

It is a pleasure to follow the right hon. Member for Basingstoke (Mrs Miller), and a number of contributions this evening chime with my view. My hon. Friend the Member for Upper Bann (Carla Lockhart) outlined our party’s broad support for the Bill; however, she and the hon. Members for Windsor (Adam Afriyie) and for Bristol North West (Darren Jones) all raised concerns that can be ironed out and worked upon as the Bill progresses, but that are worthy of reflection, from a principle perspective, at this stage. My hon. Friend rightly said that we should not ban online that which is legal offline. That issue is causing consternation and concern, and it needs to be reflected on and thought through.

There was a chink of light in the exchange between the Minister and the Chair of the Joint Committee, the hon. Member for Folkestone and Hythe (Damian Collins), who said that we want to, and should, be talking about regulating in the online domain those things that are offences offline. That is what we should be doing, not engaging in discussions about ill-defined or non-defined “legal but harmful” content. We do not know what that is. In this Bill, we are conferring significant power on the Secretary of State, not to decide that, but to bring that proposal forward through a mechanism that does not afford the greatest level of parliamentary scrutiny, as we know. This debate has been curtailed to two and a half hours, and a debate on a statutory instrument on what is legal but harmful will be 90 minutes long, and there will be no ability to amend that instrument.

There has been discussion about journalists. It is right that there should be protections for them, for democratic content and for politicians. However, article 10 of the Human Rights Act does not distinguish between the average Joe and somebody who is providing academic or journalistic content, so should we? Is that the right step? It is right that we provide protection for those individuals, but what about anyone else who wishes to enjoy freedom of expression in the online domain? It has been said that there is a right of appeal, and yes, there is—to an offshored company that marks its own homework and is satisfied with the action it has taken. But it will have removed the journalist or individual’s content, and they will have suffered the consequence, with no recourse. They cannot take a judicial review against such a company, and an individual will not be able to go to Ofcom either; it will not be interested unless a super entity or a super-class complaint is involved. There is no recourse here. Those are the sorts of issues we will have to grapple with. There are fines for the companies here, but what about recourse for the individual?

21:29
Robert Jenrick (Newark) (Con)

In the one minute you have given me to speak in this debate, let me make three brief points, Madam Deputy Speaker. First, I come to this Bill with concerns about its impact on freedom of speech. I am grateful for the reassurances I have received already, and will be following how we manage journalistic content, in particular, in order to protect that in the Bill.

Secondly, I am concerned about the Bill’s impact on our ability to tackle the abuse of the power that social media companies hold more broadly. The Bill does not contain measures to increase competition, to enable small businesses in this country to prosper and to ensure that the social media platforms do not crowd out existing businesses. I have been assured that a second Bill will follow this one and will tackle that issue, but in recent days I have heard reports in the press that that Bill will not go forward because of a lack of parliamentary time. I would be grateful if the Minister could confirm, when he responds to the debate, that that Bill will proceed, because it is an extremely important issue.

21:30
John Nicolson (Ochil and South Perthshire) (SNP)

Everyone wants to be safe online and everyone wants to keep their children safe online but, from grooming to religious radicalisation and from disinformation to cruel attacks on the vulnerable, the online world is far from safe. That is why we all agree that we need better controls while we preserve all that is good about the online world, including free speech.

This Bill is an example of how legislation can benefit from a collegiate, cross-party approach. I know because I have served on the Select Committee and the Joint Committee, both of which produced reports on the Bill. The Bill is ambitious and much of it is good, but there are some holes in the legislation and we must make important improvements before it is passed.

Debbie Abrahams (Oldham East and Saddleworth) (Lab)

Does the hon. Gentleman, with whom I served on the Joint Committee on the draft Bill, agree, having listened to the evidence of the whistleblower Frances Haugen about how disinformation was used in the US Capitol insurrection, that it is completely inadequate that there is only one clause on the subject in the Bill?

John Nicolson

Yes, and I shall return to that point later in my speech.

The Secretary of State’s powers in the Bill need to be addressed. From interested charities to the chief executive of Ofcom, there is consensus that the powers of the Secretary of State in the legislation are too wide. Child safety campaigners, human rights groups, women and girls’ charities, sports groups and democracy reform campaigners all agree that the Secretary of State’s powers threaten the independence of the regulator. That is why both the Joint Committee and the Select Committee have, unanimously and across party lines, recommended reducing the proposed powers.

We should be clear about what exactly the proposed powers will do. Under clause 40, the Secretary of State will be able to modify the draft codes of practice, giving the UK Government a huge amount of power over the independent communications regulator, Ofcom. The Government have attempted to play down the powers, saying that they would be used only in “exceptional circumstances”, but the word “exceptional” is nebulous. How frequent is exceptional? All we are told is that the exceptional circumstances could reflect changing Government “public policy”. That is far too vague, so perhaps the Secretary of State will clarify the difference between public policy and Government policy and give us some further definition of “exceptional”.

While of course I am sure Members feel certain that the current Secretary of State would exercise her powers in a calm and level-headed way, imagine if somebody intemperate held her post or—heaven forfend—a woke, left-wing snowflake from the Labour Benches did. The Secretary of State should listen to her own MPs and reduce her powers in the Bill.

Let me turn to misinformation and disinformation. The Bill aims not only to reduce abuse online but to reduce harm more generally. That cannot be done without including in the Bill stronger provisions on disinformation. As a gay man, I have been on the receiving end of abuse for my sexuality, and I have seen the devastating effect that misinformation and disinformation have had on my community. Disinformation has always been weaponised to spread hate; however, the pervasive reach of social media makes disinformation even more dangerous.

The latest battle ground for LGBT rights has seen an onslaught against trans people. Lies about them and their demand for enhanced civil rights have swirled uncontrollably. Indeed, a correspondent of mine recently lamented “trans funding” in the north-east of Scotland, misreading and misunderstanding it as involving the compulsory regendering of retiring oil workers in receipt of transitional funding from the Scottish Government. That is absurd, of course, but it says something about the frenzied atmosphere stirred up by online transphobes.

The brutal Russian invasion of Ukraine, with lies spewed by the Russian Government and their media apologists, has, like the covid pandemic, illustrated some of the other real-world harms arising from disinformation. It is now a weapon of war, with serious national security implications, yet the UK Government still do not seem to be taking it seriously enough. Full Fact, the independent fact-checking service, has said that there is currently no credible plan to tackle disinformation. The Government may well argue that disinformation will fall under the false communications provision in clause 151, but in practice that provision sets a bar that services are unlikely to be able to meet, so most disinformation will instead have to be dealt with as harmful content.

We welcome the Government’s inclusion of functionality in the risk assessments, which will look not just at content but how it spreads. Evidence from the two Committees shows that the dissemination of harm is as important as the content itself, but the Government should be more explicit in favouring content-neutral modes for reducing disinformation, as this will have less of an impact on freedom of speech. That was recommended by the Facebook whistleblowers Sophie Zhang and Frances Haugen.

John Nicolson

No, I will make some progress, if I may.

A vital tool in countering disinformation is education, and Estonia—an early and frequent victim of Russian disinformation—is a remarkable case study. That is why the Government’s decision to drop Ofcom’s clause 104 media literacy duties is perplexing. Media literacy should be a shared responsibility for schools, Government and wider society, and spreading and enhancing it should be a duty not just for Ofcom but for the larger platforms too. Ofcom should also be allowed to break platform terms and conditions for the purposes of investigation; for example, it is currently unable to create fake profiles to analyse companies’ behaviour, such as their response to abuse. Allowing that would empower the regulator.

Various issues arise when trying to legislate for harm that is not currently illegal. This is challenging for us as legislators since we do not know exactly what priority harms will be covered by secondary legislation, but we would like assurances from the Government that Zach’s law, as it has come to be known, will become a standalone offence. Vicious cowards who send seizure-inducing flashing images to people with epilepsy to trigger seizures must face criminal consequences. The Minister told me in a previous debate that this wicked behaviour will now be covered by the harmful communications offence under clause 150, but until a specific law is on the statute book, he will, I imagine, understand families’ desire for certainty.

Finally, I turn to cross-platform abuse. There has been a terrifying increase in online child abuse over the past three years; grooming offences have increased by 70% in that period. The Select Committee and the Joint Committee received a host of recommendations which, disappointingly, seem to have been somewhat ignored by the Government. On both Committees, we have been anxious to reduce “digital breadcrumbing”, where paedophiles post images of children that may look benign and will not, therefore, be picked up by scanners, with the aim of inducing children, or encouraging other paedophiles, to leave the regulated site for unregulated sites where children can be abused with impunity. I urge the Secretary of State to heed the advice of the National Society for the Prevention of Cruelty to Children: unless the measures it recommends are enacted, children will be at ever greater risk of harm.

The House will have noted that those on the SNP Benches have engaged with the Government throughout this process. Indeed, I am the only Member to have sat on both the Joint Committee and the Select Committee as this Bill has been considered and our reports written. It has been a privilege to hear from an incredible range of witnesses, some of whom have displayed enormous bravery in giving their testimony.

We want to see this legislation succeed. That there is a need for it is recognised across the House—but across the House, including on the Tory Benches, there is also recognition that the legislation can and must be improved. It is our intention to help to improve the legislation without seeking party advantage. I hope the Secretary of State will engage in the same constructive manner.

21:39
Alex Davies-Jones (Pontypridd) (Lab)

It is an honour to close this debate on behalf of the Opposition. Sadly, there is so little time for the debate that there is much that we will not even get to probe, including any mention of the Government’s underfunded and ill-thought-through online media literacy strategy.

However, we all know that change and regulation of the online space are much needed, so Labour welcomes this legislation even in its delayed form. The current model, which sees social media platforms and tech giants making decisions about what content is hosted and shared online, is simply failing. It is about time that that model of self-regulation, which gives too much control to Silicon Valley, was challenged.

Therefore, as my hon. Friend the Member for Manchester Central (Lucy Powell) said, Labour broadly supports the principles of the Bill and welcomes some aspects of the Government’s approach, including the duty of care frameworks and the introduction of an independent regulator, Ofcom. It cannot and should not be a matter for the Government of the time to control what people across the UK are able to access online. Labour will continue to work hard to ensure that Ofcom remains truly independent of political influence.

We must also acknowledge, however, that after significant delays this Bill is no longer world leading. The Government first announced their intention to regulate online spaces all the way back in 2018. Since then, the online space has remained unregulated and, in many cases, has perpetuated dangerous and harmful misinformation with real-world consequences. Colleagues will be aware of the sheer amount of coronavirus vaccine disinformation so easily accessed by millions online at the height of the pandemic. Indeed, in many respects, it was hard to avoid.

More recently, the devastating impact of state disinformation at the hands of Putin’s regime has been clearer than ever, almost two years after Parliament’s own Intelligence and Security Committee called Russian influence in the UK “the new normal”.

Deidre Brock

Does the hon. Lady share my disappointment and concern that the Bill does nothing to address misinformation and disinformation in political advertising? A rash of very aggressive campaign groups emerged before the last Scottish Parliament elections, for example; they spent heavily on online political advertising, but were not required to reveal their political ties or funding sources. That is surely not right.

Alex Davies-Jones

I share the hon. Lady’s concern. There is so much more that is simply missing from this Bill, which is why it is just not good enough. We have heard in this debate about a range of omissions from the Bill and the loopholes that, despite the years of delay, have still not been addressed by the Government. I thank hon. Members on both sides of the House for pointing those out. It is a shame that we are not able to address them individually here, but we will probe those valued contributions further in the Bill Committee.

Despite huge public interest and a lengthy prelegislative scrutiny process, the Government continue to ignore many key recommendations, particularly around defining and regulating both illegal and legal but harmful content online. The very nature of the Bill and its heavy reliance on secondary legislation to truly flesh out the detail leaves much to be desired. We need to see action now if we are truly to keep people safe online.

Most importantly, this Bill is an opportunity, and an important one at that, to decide the kind of online world our children grow up in. For many across the House, growing up online as children do now is completely unimaginable. When I was young, we played Snake on a Nokia 3310 and had to wait for the dial-up, and for people to get off the phone, in order to go online and access MSN, but for people today access to the internet, social media and everything that brings is a fundamental part of their lives.

Once again, however, far too much of the detail, including the specifics of how this legislation will fundamentally change the user experience, is simply missing from the Bill. When it comes to harmful content that is not illegal, the Government have provided no detail. Despite the Bill being years in the making, we are no closer to understanding the impact it will have on users.

The Bill in its current draft has a huge focus on the tools for removing and moderating harmful content, rather than ensuring that design features are in place to make services systematically safer for all of us. The Government are thus at real risk of excluding children from being able to participate in the digital world freely and safely. The Bill must not lock children out of services they are entitled to use; instead, it must focus on making those services safe by design.

I will push the Minister on this particular point. We are all eager to hear what exact harms platforms will have to take steps to address and mitigate. Will it be self-harm? Will it perhaps be content promoting eating disorders, racism, homophobia, antisemitism and misogyny? One of the key problems with the Bill is the failure to make sure that the definitions of “legal but harmful” content are laid out within it. Will the Minister therefore commit to amending the Bill to address this and to allow for proper scrutiny? As we have heard, the Government have also completely failed to address what stakeholders term the problem of breadcrumbing. I would be grateful if the Minister outlined what steps the Government will be taking to address this issue, as there is clearly a loophole in the Bill that would allow this harmful practice to continue.

As we have heard, the gaps in the Bill, sadly, do not end there. Women and girls are disproportionately likely to be affected by online abuse and harassment. Online violence against women and girls is defined as including but not limited to

“intimate image abuse, online harassment, the sending of unsolicited explicit images, coercive ‘sexting’, and the creation and sharing of ‘deepfake’ pornography.”

This Bill is an important step forward, but it will need significant strengthening to make online spaces safe for women and girls. While we welcome the steps the Government have taken to include cyber-flashing in the Bill, it must go further in other areas. Misogyny should be included as a harm to adults that online platforms have a duty to prevent from appearing on their services. As colleagues will be aware, Instagram has been completely failing to tackle misogynistic abuse sent via direct message. The Centre for Countering Digital Hate has exposed what it terms an “epidemic of misogynistic abuse”, 90% of which has been completely and utterly ignored by Instagram, even when it has been reported to moderators. The Government must see sense and put violence against women and girls into the Bill, and it must also form a central pillar of the regulation of legal but harmful content. Will the Minister therefore commit to at least outlining the definitions of “legal but harmful” content, both for adults and for children, in the Bill?

Another major omission from the Bill as currently drafted is its rather arbitrary categorisation of platforms based on size rather than harm. As many hon. Members have mentioned, the categorisation system as it currently stands will completely fail to address some of the most extreme harms on the internet. Thanks to the fantastic work of organisations such as Hope not Hate and the Antisemitism Policy Trust, we know that smaller platforms such as 4chan and BitChute have significant numbers of users who are highly motivated to promote extremely dangerous content. The Minister must accept that his Department has been completely tone-deaf on this particular point and that, as hon. Members have said today, its decision making has been utterly inexplicable. Rather than an arbitrary size cut-off, the regulator must instead use risk levels to determine which category a platform should fall into, so that harmful and dangerous content does not slip through the net. Exactly when will the Minister’s Department publish more detail on this categorisation system? And what does he have to say to those people, including many Members here today, who have found themselves the victims of abusive content originating on these hate-driven smaller platforms? How will this Bill change their experience of being online? I will save him the energy, because we all know the real answer: it will do little to change the situation.

This Bill was once considered a once-in-a-generation opportunity to improve internet safety for good, and Labour wants to work with the Government to get this right. Part of our frustration is due to the way in which the Government have failed to factor technological change and advancement—which, as we all know, and as we have heard today, can be extremely rapid—into the workings of this Bill. While the Minister and I disagree on many things, I am sure that we are united in saying that no one can predict the future, and that is not where my frustrations lie. Instead, I feel that the Bill has failed to address issues that are developing right now—from developments in online gaming to the expansion of the metaverse. These are complicated concepts but they are also a reality that we as legislators must not shy away from.

The Government have repeatedly said that the Bill’s main objective is to protect children online, and of course it goes without saying that Labour supports that. Yet with the Bill being so restricted to user-to-user services, there are simply too many missed opportunities to deal with areas where children, and often adults, are likely to be at risk of harm. Online gaming is a space that is rightly innovative and fast-changing, but the rigid nature of how services have been categorised will soon mean that the Bill is outdated long before it has had a chance to have a positive impact. The same goes for the metaverse.

While of course Labour welcomes the Government’s commitment to prevent under-18s from accessing pornography online, the Minister must be realistic. A regime that seeks to ban rather than prevent is unlikely to ever be able to keep up with the creative, advanced nature of the tech industry. For that reason, I must press the Minister on exactly how this Bill will be sufficiently flexible and future-proofed to avoid a situation whereby it is outdated by the time it finally receives Royal Assent. We must make sure that we get this right, and the Government know that they could and can do more. I therefore look forward to the challenge and to working with colleagues across the House to strengthen this Bill throughout its passage.

21:49
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

The piece of legislation before the House this evening is truly groundbreaking, because no other jurisdiction anywhere in the world has attempted to legislate as comprehensively as we are beginning to legislate here. For too long, big tech companies have exposed children to risk and harm, as evidenced by the tragic suicide of Molly Russell, who was exposed to appalling content on Instagram, which encouraged her, tragically, to take her own life. For too long, large social media firms have allowed illegal content to go unchecked online.

Richard Burgon (Leeds East) (Lab)

I have spoken before about dangerous suicide-related content online. The Minister mentions larger platforms. Will the Government go away and bring back two amendments based on points made by the Samaritans? One would bring smaller platforms within the scope of sanctions, and the second would make the protective aspects of the Bill cover people who are over 18, not just those who are under 18. If the Government do that, I am sure that it will be cause for celebration and that Members on both sides of the House will give their support.

Chris Philp

It is very important to emphasise that, regardless of size, all platforms in the scope of the Bill are covered if there are risks to children.

A number of Members, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy), have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered. However, all platforms, regardless of size, are in scope with regard to content that is illegal and to content that is harmful to children.

For too long, social media firms have also arbitrarily censored content just because they do not like it. With the passage of this Bill, all those things will be no more, because it creates parliamentary sovereignty over how the internet operates, and I am glad that the principles in the Bill command widespread cross-party support.

The pre-legislative scrutiny that we have gone through has been incredibly intensive. I thank and pay tribute to the DCMS Committee and the Joint Committee for their work. We have adopted 66 of the Joint Committee’s recommendations. The Bill has been a long time in preparation. We have been thoughtful, and the Government have listened and responded. That is why the Bill is in good condition.

Debbie Abrahams

Will the Minister give way?

Chris Philp

I must make some progress, because I am almost out of time and there are lots of things to reply to.

I particularly thank previous Ministers, who have done so much fantastic work on the Bill. With us this evening are my hon. Friend the Member for Gosport (Dame Caroline Dinenage) and my right hon. Friends the Members for Maldon (Mr Whittingdale) and for Basingstoke (Mrs Miller), but not with us this evening are my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright), who I think is in America, and my right hon. Friends the Members for Hertsmere (Oliver Dowden) and for Staffordshire Moorlands (Karen Bradley), all of whom showed fantastic leadership in getting the Bill to where it is today. It is a Bill that will stop illegal content circulating online, protect children from harm and make social media firms be consistent in the way they handle legal but harmful content, instead of being arbitrary and inconsistent, as they are at the moment.

Chris Philp

I have so many points to reply to that I have to make some progress.

The Bill also enshrines, for the first time, free speech—something that we all feel very strongly about—but it goes beyond that. As well as enshrining free speech in clause 19, it gives special protection, in clauses 15 and 16, to content of journalistic and democratic importance. As my right hon. Friend the Secretary of State indicated in opening the debate, we intend to table a Government amendment—a point that my right hon. Friends the Members for Maldon and for Ashford (Damian Green) asked me to confirm—to make sure that journalistic content cannot be removed until a proper appeal has taken place. I am pleased to confirm that now.

We have made many changes to the Bill. Online fraudulent advertisers are now banned. Senior manager liability will commence immediately. Online porn of all kinds, including commercial porn, is now in scope. The Law Commission communication offences are in the Bill. The offence of cyber-flashing is in the Bill. The priority offences are on the face of the Bill, in schedule 7. Control over anonymity and user choice, which was proposed by my hon. Friend the Member for Stroud (Siobhan Baillie) in her ten-minute rule Bill, is in the Bill. All those changes have been made because this Government have listened.

Let me turn to some of the points made from the Opposition Front Bench. I am grateful for the in-principle support that the Opposition have given. I have enjoyed working with the shadow Minister and the shadow Secretary of State, and I look forward to continuing to do so during the many weeks in Committee ahead of us, but there were one or two points made in the opening speech that were not quite right. This Bill does deal with systems and processes, not simply with content. There are risk assessment duties. There are safety duties. There are duties to prevent harm. All those speak to systems and processes, not simply content. I am grateful to the Chairman of the Joint Committee, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), for confirming that in his excellent speech.

If anyone in this House wants confirmation of where we are on protecting children, the Children’s Commissioner wrote a joint article with the Secretary of State in the Telegraph—I think it was this morning—confirming her support for the measures in the Bill.

When it comes to disinformation, I would make three quick points. First, we have a counter-disinformation unit, which is battling Russian disinformation night and day. Secondly, any disinformation that is illegal, that poses harm to children or that comes under the definition of “legal but harmful” in the Bill will be covered. Thirdly, if that is not enough, the Minister for Security and Borders, who is sitting here next to me, intends to bring forward legislation at the earliest opportunity to counter hostile state threats more generally. That matter will be addressed in the Bill he will prepare and bring forward.

I have only four minutes left and there are so many points to reply to. If I do not cover them all, I am very happy to speak to Members individually, because so many important points were made. The right hon. Member for Barking asked who was going to pay for all the Ofcom enforcement. The taxpayer will pay for the first two years while we get ready—£88 million over two years—but after that Ofcom will levy fees on these social media firms, so they will pay for regulating their activities. I have already replied to the point she rightly raised about smaller but very harmful platforms.

My hon. Friend the Member for Meriden (Saqib Bhatti) has been campaigning tirelessly on the question of combating racism. This Bill will deliver what he is asking for.

The hon. Member for Batley and Spen (Kim Leadbeater) and my hon. Friend the Member for Watford (Dean Russell) asked about Zach’s law. Let me take this opportunity to confirm explicitly that clause 150—the harmful communication clause, for where a communication is intended to cause psychological distress—will cover epilepsy trolling. What happened to Zach will be prevented by this Bill. In addition, the Ministry of Justice and the Law Commission are looking at whether we can also have a standalone provision, but let me assure them that clause 150 will protect Zach.

My right hon. Friend the Member for Maldon asked a number of questions about definitions. Companies can move between category 1 and category 2, and different parts of a large conglomerate can be regulated differently depending on their activities. Let me make one point very clear—the hon. Member for Bristol North West (Darren Jones) also raised this point. When it comes to the provisions on “legal but harmful”, neither the Government nor Parliament are saying that those things have to be taken down. We are not censoring in that sense. We are not compelling social media firms to remove content. All we are saying is that they must do a risk assessment, have transparent terms and conditions, and apply those terms and conditions consistently. We are not compelling, we are not censoring; we are just asking for transparency and accountability, which is sorely missing at the moment. No longer will those in Silicon Valley be able to behave in an arbitrary, censorious way, as they do at the moment—something that Members of this House have suffered from, but from which they will no longer suffer once this Bill passes.

The hon. Member for Bristol North West, who I see is not here, asked a number of questions, one of which was about—[Interruption.] He is here; I do apologise. He has moved—I see he has popped up at the back of the Chamber. He asked about codes of practice not being mandatory. That is because the safety duties are mandatory. The codes of practice simply illustrate ways in which those duties can be met. Social media firms can meet them in other ways, but if they fail to meet those duties, Ofcom will enforce. There is no loophole here.

When it comes to the ombudsman, we are creating an internal right of appeal for the first time, so that people can appeal to the social media firms themselves. There will have to be a proper right of appeal, and if there is not, they will be enforced against. We do not think it appropriate for Ofcom to consider every individual complaint, because it will simply be overwhelmed, by probably tens of thousands of complaints, but Ofcom will be able to enforce where there are systemic failures. We feel that is the right approach.

I say to the hon. Member for Plymouth, Sutton and Devonport (Luke Pollard) that my right hon. Friend the Minister for Security and Borders will meet him about the terrible Keyham shooting.

The hon. Member for Washington and Sunderland West (Mrs Hodgson) raised a question about online fraud in the context of search. That is addressed by clause 35, but we do intend to make drafting improvements to the Bill, and I am happy to work with her on those.

I have been speaking as quickly as I can, which is quite fast, but I think time has got away from me. This Bill is groundbreaking. It will protect our citizens, it will protect our children—[Hon. Members: “Sit down!”]—and I commend it to the House.

Question put and agreed to.

Bill accordingly read a Second time.

Madam Deputy Speaker (Dame Eleanor Laing)

The Minister just made it. I have rarely seen a Minister come so close to talking out his own Bill.

Online Safety Bill (Programme)

Motion made, and Question put forthwith (Standing Order No. 83A(7)),

That the following provisions shall apply to the Online Safety Bill:

Committal

(1) The Bill shall be committed to a Public Bill Committee.

Proceedings in Public Bill Committee

(2) Proceedings in the Public Bill Committee shall (so far as not previously concluded) be brought to a conclusion on Thursday 30 June 2022.

(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.

Consideration and Third Reading

(4) Proceedings on Consideration shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.

(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.

(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration and Third Reading.

Other proceedings

(7) Any other proceedings on the Bill may be programmed.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Money)

Queen’s recommendation signified.

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise the payment out of money provided by Parliament of:

(1) any expenditure incurred under or by virtue of the Act by the Secretary of State, and

(2) any increase attributable to the Act in the sums payable under any other Act out of money so provided.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Ways and Means)

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise:

(1) the charging of fees under the Act, and

(2) the payment of sums into the Consolidated Fund.—(Michael Tomlinson.)

Question agreed to.

Deferred Divisions

Motion made, and Question put forthwith (Standing Order No. 41A(3)),

That at this day’s sitting, Standing Order No. 41A (Deferred divisions) shall not apply to the Motion in the name of Secretary Nadine Dorries relating to Online Safety Bill: Carry-over.—(Michael Tomlinson.)

Question agreed to.

Madam Deputy Speaker (Dame Eleanor Laing)

Order. People really ought to have more courtesy than to get up and behave as if the House were not sitting, while there is still business going on, simply because it is after 10 o’clock. We really have to observe courtesy at all times in here.

Online Safety Bill (Carry-Over)

Motion made, and Question put forthwith (Standing Order No. 80A(1)(a)),

That if, at the conclusion of this Session of Parliament, proceedings on the Online Safety Bill have not been completed, they shall be resumed in the next Session.—(Michael Tomlinson.)

Question agreed to.