Online Safety Bill

(Limited Text - Ministerial Extracts only)

2nd reading
Tuesday 19th April 2022


Commons Chamber
The Secretary of State for Digital, Culture, Media and Sport (Ms Nadine Dorries)

I beg to move, That the Bill be now read a Second time.

Given the time and the number of people indicating that they wish to speak, and given that we will have my speech, the shadow Minister’s speech and the two winding-up speeches, there might be 10 minutes left for people to speak. I will therefore take only a couple of interventions and speak very fast in the way I can, being northern.

Almost every aspect of our lives is now conducted via the internet, from work and shopping to keeping up with our friends, family and worldwide real-time news. Via our smartphones and tablets, we increasingly spend more of our lives online than in the real world.

In the past 20 years or so, it is fair to say that the internet has overwhelmingly been a force for good, for prosperity and for progress, but Members on both sides of the House will agree that, as technology advances at warp speed, so have the new dangers this progress presents to children and young people.

Mr Mark Francois (Rayleigh and Wickford) (Con)

My right hon. Friend will know that, last Wednesday, the man who murdered our great friend Sir David Amess was sentenced to a whole-life term. David felt very strongly that we need legislation to protect MPs, particularly female MPs, from vile misogynistic abuse. In his memory, will she assure me that her Bill will honour the spirit of that request?

Ms Dorries

Sir David was a friend to all of us, and he was very much at the forefront of my mind during the redrafting of this Bill over the last few months. I give my right hon. Friend my absolute assurance on that.

Jim Shannon (Strangford) (DUP)

A number of constituents have contacted me over the last few months about eating disorders, particularly anorexia and bulimia, and about bullying in schools. Will the Secretary of State assure me and this House that those concerns will be addressed by this Bill so that my constituents are protected?

Ms Dorries

They will. Inciting people to take their own life or encouraging eating disorders in anorexia chatrooms—all these issues are covered by the Bill.

Several hon. Members rose—

Ms Dorries

I will take one more intervention.

Jonathan Gullis (Stoke-on-Trent North) (Con)

I am grateful to my right hon. Friend, and I thank her for her written communications regarding Angela Stevens, the mother of Brett, who tragically took his own life having been coerced by some of these vile online sites. The Law Commission considered harmful online communications as part of the Bill’s preparation, and one of its recommendations is to introduce a new offence of encouraging or assisting self-harm. I strongly urge my right hon. Friend to adopt that recommendation. Can she say more on that?

Ms Dorries

Yes. Exactly those issues will be listed in secondary legislation, under “legal but harmful”. I will talk about that further in my speech, but “legal but harmful” focuses on some of the worst harmful behaviours. We are talking not about an arbitrary list, but about incitement to encourage people to take their own life and encouraging people into suicide chatrooms—behaviour that is not illegal but which is indeed harmful.

Several hon. Members rose—

Ms Dorries

I am going to whizz through my speech now in order to allow people who have stayed and want to speak to do so.

As the Minister for mental health for two years, I too often heard stories such as the one just highlighted by my hon. Friend the Member for Stoke-on-Trent North (Jonathan Gullis). We have all sat down with constituents and listened as the worst stories any parents could recount were retold: stories of how 14-year-old girls take their own life after being directed via harmful algorithms into a suicide chatroom; and of how a child has been bombarded with pro-anorexia content, or posts encouraging self-harm or cyber-bullying.

School bullying used to stop at the school gate. Today, it accompanies a child home, on their mobile phone, and is lurking in the bedroom waiting when they switch on their computer. It is the last thing a bullied child reads at night before they sleep and the first thing they see when they wake in the morning. A bullied child is no longer bullied in the playground on school days; they are bullied 24 hours a day, seven days a week. Childhood innocence is being stolen at the click of a button. One extremely worrying figure from 2020 showed that 80% of 12 to 15-year-olds had at least one potentially harmful online experience in the previous year.

We also see this every time a footballer steps on to the pitch, only to be subjected to horrific racism online, including banana and monkey emojis. As any female MP in this House will tell you, a woman on social media—I say this from experience—faces a daily barrage of toxic abuse. It is not criticism—criticism is fair game—but horrific harassment and serious threats of violence. Trolls post that they hope we get raped or killed, urge us to put a rope around our neck, or say they want to watch us burn alive in a car—my own particular experience.

All this behaviour is either illegal or, almost without exception, explicitly banned in a platform’s terms and conditions. Commercially, it has to be. If a platform stated openly that it allowed such content on its sites, which advertisers, its financial lifeblood, would knowingly endorse and advertise on it? Which advertisers would do that? Who would openly use or allow their children to use sites that state that they allow illegal and harmful activity? None, I would suggest, and platforms know that. Yet we have almost come to accept this kind of toxic behaviour and abuse as part and parcel of online life. We have factored online abuse and harm into our daily way of life, but it should not and does not have to be this way.

This Government promised in their manifesto to pass legislation to tackle these problems and to make the UK the

“safest place in the world to be online”

especially for children. We promised legislation that would hold social media platforms to the promises they have made to their own users—their own stated terms and conditions—promises that too often are broken with no repercussions. We promised legislation that would bring some fundamental accountability to the online world. That legislation is here in the form of the groundbreaking Online Safety Bill. We are leading the way and free democracies across the globe are watching carefully to see how we progress this legislation.

The Bill has our children’s future, their unhindered development and their wellbeing at its heart, while at the same time providing enhanced protections for freedom of speech. At this point, I wish to pay tribute to my predecessors, who have each trodden the difficult path of balancing freedom of speech and addressing widespread harms, including my immediate predecessor and, in particular, my hon. Friend the Member for Gosport (Dame Caroline Dinenage), who worked so hard, prior to my arrival in the Department for Digital, Culture, Media and Sport, with stakeholders and platforms, digging in to identify the scope of the problem.

Let me summarise the scope of the Bill. We have reserved our strongest measures in this legislation for children. For the first time, platforms will be required under law to protect children and young people from all sorts of harm, from the most abhorrent child abuse to cyber-bullying and pornography. Tech companies will be expected to use every possible tool to do so, including introducing age-assurance technologies, and they will face severe consequences if they fail in the most fundamental of requirements to protect children. The bottom line is that, by our passing this legislation, our youngest members of society will be far safer when logging on. I am so glad to see James Okulaja and Alex Holmes from The Diana Award here today, watching from the Gallery as we debate this groundbreaking legislation. We have worked closely with them as we have developed the legislation, as they have dedicated a huge amount of their time to protecting children from online harms. This Bill is for them and those children.

The second part of the Bill makes sure that platforms design their services to prevent them from being abused by criminals. When illegal content does slip through the net, such as child sex abuse and terrorist content, they will need to have effective systems and processes in place to quickly identify it and remove it from their sites. We will not allow the web to be a hiding place or a safe space for criminals. The third part seeks to force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all sorts of other unacceptable behaviour that they claim not to allow but that ruins life in practice. In other words, we are just asking the largest platforms to simply do what they say they will do, as we do in all good consumer protection measures in any other industry. If platforms fail in any of those basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down the full weight of the law upon them.

Several hon. Members rose—

Ms Dorries

I will take just two more interventions and that will be it, otherwise people will not have a chance to speak.

Sir John Hayes (South Holland and The Deepings) (Con)

I am very grateful to my right hon. Friend for giving way. The internet giants that run the kind of awful practices that she has described have for too long been unaccountable, uncaring and unconscionable in the way they have fuelled every kind of spite and fed every kind of bigotry. Will she go further in this Bill and ensure that, rather like any other publisher, if those companies are prepared to allow anonymous posts, they are held accountable for those posts and subject to the legal constraints that a broadcaster or newspaper would face?

Ms Dorries

These online giants will be held accountable to their own terms and conditions. They will be unable any longer to allow illegal content to be published, and we will also be listing in secondary legislation offences that will be legal but harmful. We will be holding those tech giants to account.

Munira Wilson (Twickenham) (LD)

I thank the Secretary of State for giving way. She talked about how this Bill is going to protect children much more, and it is a welcome step forward. However, does she accept that there are major gaps in this Bill? For instance, gaming is not covered. It is not clear whether things such as virtual reality and the metaverse are going to be covered. [Interruption.] It is not clear and all the experts will tell us that. The codes of practice in the Bill are only recommended guidance; they are not mandatory and binding on companies. That will encourage a race to the bottom.

Ms Dorries

The duties are mandatory; it is the Online Safety Bill and the metaverse is included in the Bill. Not only is it included, but, moving forward, the provisions in the Bill will allow us to move swiftly with the metaverse and other things. We did not even know that TikTok existed when this Bill started its journey. These provisions will allow us to move quickly to respond.

Several hon. Members rose—

--- Later in debate ---
Ms Dorries

I will take one more intervention, but that is it.

Damian Green (Ashford) (Con)

I am grateful to my right hon. Friend for giving way. One of the most important national assets that needs protecting in this Bill and elsewhere is our reputation for serious journalism. Will she therefore confirm that, as she has said outside this House, she intends to table amendments during the passage of the Bill that will ensure that platforms and search engines that have strategic market status protect access to journalism and content from recognised news publishers, ensuring that it is not moderated, restricted or removed without notice or right of appeal, and that those news websites will be outside the scope of the Bill?

Ms Dorries

We have already done that—it is already in the Bill.

Daniel Kawczynski (Shrewsbury and Atcham) (Con)

Will my right hon. Friend give way?

Ms Dorries

No, I have to continue.

Not only will the Bill protect journalistic content, democratic content and democratic free speech, but if one of the tech companies wanted to take down journalistic content, the Bill includes a right of appeal for journalists, which currently does not exist. We are doing further work on that to ensure that content remains online while the appeal takes place. The appeal process has to be robust and consistent across the board for all the appeals that take place. We have already done more work on that issue in this version of the Bill and we are looking to do more as we move forward.

As I have said, we will not allow the web to be a hiding place or safe space for criminals and when illegal content does slip through the net—such as child sex abuse and terrorist content— online platforms will need to have in place effective systems and processes to quickly identify that illegal content and remove it from their sites.

The third measure will force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all the other unacceptable behaviours. In other words, we are asking the largest platforms to do what they say they will do, just as happens with all good consumer-protection measures in any other industry. Should platforms fail in any of their basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down upon them the full weight of the law. Such action includes searching platforms’ premises and confiscating their equipment; imposing huge fines of up to 10% of their global turnover; pursuing criminal sanctions against senior managers who fail to co-operate; and, if necessary, blocking their sites in the UK.

We know that tech companies can act very quickly when they want to. Last year, when an investigation revealed that Pornhub allowed child sexual exploitation and abuse imagery to be uploaded to its platform, Mastercard and Visa blocked the use of their cards on the site. Lo and behold, threatened with the prospect of losing a huge chunk of its profit, Pornhub suddenly removed nearly 10 million child sexual exploitation videos from its site overnight. These companies have the tools but, unfortunately, as they have shown time and again, they need to be forced to use them. That is exactly what the Bill will do.

Before I move on, let me point out something very important: this is not the same Bill as the one published in draft form last year. I know that Members throughout the House are as passionate as I am about getting this legislation right, and I had lots of constructive feedback on the draft version of the Bill. I have listened carefully to all that Members have had to say throughout the Bill’s process, including by taking into account the detailed feedback from the Joint Committee, the Digital, Culture, Media and Sport Committee and the Petitions Committee. They have spent many hours considering every part of the Bill, and I am extremely grateful for their dedication and thorough recommendations on how the legislation could be improved.

As a result of that feedback process, over the past three months or so I have strengthened the legislation in a number of important ways. There were calls for cyber-flashing to be included; cyber-flashing is now in the Bill. There were calls to ensure that the legislation covered all commercial pornography sites; in fact, we have expanded the Bill’s scope to include every kind of provider of pornography. There were concerns about anonymity, so we have strengthened the Bill so that it now requires the biggest tech platforms to offer verification and empowerment tools for adult users, allowing people to block anonymous trolls from the beginning.

I know that countless MPs are deeply concerned about how online fraud—particularly scam ads—has proliferated over the past few years. Under the new version of the Bill, the largest and highest-risk companies—those that stand to make the most profit—must tackle scam ads that appear on their services.

We have expanded the list of priority offences named on the face of the legislation to include not just terrorism and child abuse imagery but revenge porn, fraud, hate crime, encouraging and assisting suicide, and organised immigration crime, among other offences.

If anyone doubted our appetite to go after Silicon Valley executives who do not co-operate with Ofcom, they will see that we have strengthened the Bill so that the criminal sanctions for senior managers will now come into effect as soon as possible after Royal Assent— I am talking weeks, not years. We have expanded the things for which those senior managers will be criminally liable to cover falsifying data, destroying data and obstructing Ofcom’s access to their premises.

In addition to the regulatory framework in the Bill that I have described, we are creating three new criminal offences. While the regulatory framework is focused on holding companies to account, the criminal offences will be focused on individuals and the way people use and abuse online communications. Recommended by the Law Commission, the offences will address coercive and controlling behaviour by domestic abusers; threats to rape, kill or inflict other physical violence; and the sharing of dangerous disinformation deliberately to inflict harm.

This is a new, stronger Online Safety Bill. It is the most important piece of legislation that I have ever worked on and it has been a huge team effort to get here. I am confident that we have produced something that will protect children and the most vulnerable members of society while being flexible and adaptable enough to meet the challenges of the future.

Let me make something clear in relation to freedom of speech. Anyone who has actually read the Bill will recognise that its defining focus is the tackling of serious harm, not the curtailing of free speech or the prevention of adults from being upset or offended by something they have seen online. In fact, along with countless others throughout the House, I am seriously concerned about the power that big tech has amassed over the past two decades and the huge influence that Silicon Valley now wields over public debate.

We in this place are not the arbiters of free speech. We have left it to unelected tech executives on the west coast to police themselves. They decide who is and who is not allowed on the internet. They decide whose voice should be heard and whose should be silenced—whose content is allowed up and what should be taken down. Too often, their decisions are arbitrary and inconsistent. We are left, then, with a situation in which the president of the United States can be banned by Twitter while the Taliban is not; in which talkRADIO can be banned by YouTube for 12 hours; in which an Oxford academic, Carl Heneghan, can be banned by Twitter; or in which an article in The Mail on Sunday can be plastered with a “fake news” label—all because they dared to challenge the west coast consensus or to express opinions that Silicon Valley does not like.

It is, then, vital that the Bill contains strong protections for free speech and for journalistic content. For the first time, under this legislation all users will have an official right to appeal if they feel their content has been unfairly removed. Platforms will have to explain themselves properly if they remove content and will have special new duties to protect journalistic content and democratically important content. They will have to keep those new duties in mind whenever they set their terms and conditions or moderate any content on their sites. I emphasise that the protections are new. The new criminal offences update section 1 of the Malicious Communications Act 1988 and section 127 of the Communications Act 2003, which were so broad that they interfered with free speech while failing to address seriously harmful consequences.

Without the Bill, social media companies would be free to continue to arbitrarily silence or cancel those with whom they do not agree, without any need for explanation or justification. That situation should be intolerable for anyone who values free speech. For those who quite obviously have not read the Bill and say that it concedes power to big tech companies, I have this to say: those big tech companies have all the power in the world that they could possibly want, right now. How much more power could we possibly concede?

That brings me to my final point. We now face two clear options. We could choose not to act and leave big tech to continue to regulate itself and mark its own homework, as it has been doing for years with predictable results. We have already seen that too often, without the right incentives, tech companies will not do what is needed to protect their users. Too often, their claims about taking steps to fix things are not backed up by genuine actions.

I can give countless examples from the past two months alone of tech not taking online harm and abuse seriously, wilfully promoting harmful algorithms or putting profit before people. A recent BBC investigation showed that women’s intimate pictures were being shared across the platform Telegram to harass, shame and blackmail women. The BBC reported 100 images to Telegram as pornography, but 96 were still accessible a month later. Tech did not act.

Twitter took six days to suspend the account of rapper Wiley after his disgusting two-day antisemitic rant. Just last week, the Centre for Countering Digital Hate said that it had reported 253 accounts to Instagram as part of an investigation into misogynistic abuse on the platform, but almost 90% remained active a month later. Again, tech did not act.

Remember: we have been debating these issues for years. They were the subject of one of my first meetings in this place in 2005. During that time, things have got worse, not better. If we choose the path of inaction, it will be on us to explain to our constituents why we did nothing to protect their children from preventable risks, such as grooming, pornography, suicide content or cyber-bullying. To those who say protecting children is the responsibility of parents, not the job of the state, I would quote the 19th-century philosopher John Stuart Mill, one of the staunchest defenders of individual freedom. He wrote in “On Liberty” that the role of the state was to fulfil the responsibility of the parent in order to protect a child where a parent could not. If we choose not to act, in the years to come we will no doubt ask ourselves why we did not act to impose fundamental online protections.

However, we have another option. We can pass this Bill and take huge steps towards tackling some of the most serious forms of online harm: child abuse, terrorism, harassment, death threats, and content that is harming children across the UK today. We could do what John Stuart Mill wrote was the core duty of Government. The right to self-determination is not unlimited. An action that results in doing harm to another is not only wrong, but wrong enough that the state can intervene to prevent that harm from occurring. We do that in every other part of our life. We erect streetlamps to make our cities and towns safer. We put speed limits on our roads and make seatbelts compulsory. We make small but necessary changes to protect people from grievous harm. Now it is time to bring in some fundamental protections online.

We have the legislation ready right now in the form of the Online Safety Bill. All we have to do is pass it. I am proud to commend the Bill to the House.

Several hon. Members rose—

--- Later in debate ---
Darren Jones (Bristol North West) (Lab)

In the interest of time, I will just pose a number of questions, which I hope the Minister might address in summing up. The first is about the scope of the Bill. The Joint Committee of which I was a member recommended that the age-appropriate design code, which is very effectively used by the Information Commissioner, be used as a benchmark in the Bill, so that any services accessed or likely to be accessed by children are regulated for safety. I do not understand why the Government rejected that suggestion, and I would be pleased to hear from the Minister why they did so.

Secondly, the Bill delegates lots of detail to statutory instruments, codes of practice from the regulator, or later decisions by the Secretary of State. Parliament must see that detail before the Bill becomes an Act. Will the Minister commit to those delegated decisions being published before the Bill becomes an Act? Could he explain why the codes of practice are not being set as mandatory? I do not understand why codes of practice, much of the detail of which the regulator is being asked to set, will not be made mandatory for businesses. How can minimum standards for age or identity verification be imposed if those codes of practice are not made mandatory? Perhaps the Minister could explain.

Many users across the country will want to ensure that their complaints are dealt with effectively. We recommended an ombudsman service that dealt with complaints that were exhausted through a complaints system at the regulated companies, but the Government rejected it. Please could the Minister explain why?

I was pleased that the Government accepted the concept of the ability for a super-complaint to be brought on behalf of groups of users, but the decision as to who will be able to bring a super-complaint has been deferred, subject to a decision by the Secretary of State. Why, and when will that decision be taken? If the Minister could allude to who they might be, I am sure that would be welcome.

Lastly, there are a number of exemptions and more work to be done, which leaves significant holes in the legislation. There is much more work to be done on clauses 5, 6 and 50—on democratic importance, journalism and the definition of journalism, on the exemptions for news publishers, and on disinformation, which is mentioned only once in the entire Bill. I and many others recognise that these are not easy issues, but they should be considered fully before legislation is proposed that has gaping holes for people who want to get around it, and for those who wish to test the parameters of this law in the courts, probably for many years. All of us, on a cross-party basis in this House, support the Government's endeavours to make it safe for children and others to be online. We want the legislation to be implemented as quickly as possible and to be as effective as possible, but there are significant concerns that it will be jammed up in the judicial system, where this House is unacceptably giving judges the job of fleshing out the definition of what many of the important exemptions will mean in practice.

The idea that the Secretary of State has the power to intervene with the independent regulator and tell it what it should or should not do obviously undermines the idea of an independent regulator. While Ministers might give assurances to this House that the power will not be abused, I believe that other countries, whether China, Russia, Turkey or anywhere else, will say, “Look at Great Britain. It thinks this is an appropriate thing to do. We’re going to follow the golden precedent set by the UK in legislating on these issues and give our Ministers the ability to decide what online content should be taken down.” That seems a dangerous precedent.

Darren Jones

The Minister is shaking his head, but I can tell him that the legislation does do that, because we looked at this and took evidence on it. The Secretary of State would be able to tell the regulator that content should be “legal but harmful” and therefore should be removed as part of its systems design online. We also heard that the ability to do that at speed is very restricted and therefore the power is ineffective in the first place. Therefore, the Government should evidently change their position on that. I do not understand why, in the face of evidence from pretty much every stakeholder, the Government agree that that is an appropriate use of power or why Parliament would vote that through.

I look forward to the Minister giving his answers to those questions, in the hope that, as the Bill proceeds through the House, it can be tidied up and made tighter and more effective, to protect children and adults online in this country.

--- Later in debate ---
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

The piece of legislation before the House this evening is truly groundbreaking, because no other jurisdiction anywhere in the world has attempted to legislate as comprehensively as we are beginning to legislate here. For too long, big tech companies have exposed children to risk and harm, as evidenced by the tragic suicide of Molly Russell, who was exposed to appalling content on Instagram, which encouraged her, tragically, to take her own life. For too long, large social media firms have allowed illegal content to go unchecked online.

Richard Burgon (Leeds East) (Lab)

I have spoken before about dangerous suicide-related content online. The Minister mentions larger platforms. Will the Government go away and bring back two amendments based on points made by the Samaritans? One would bring smaller platforms within the scope of sanctions, and the second would make the protective aspects of the Bill cover people who are over 18, not just those who are under 18. If the Government do that, I am sure that it will be cause for celebration and that Members on both sides of the House will give their support.

Chris Philp

It is very important to emphasise that, regardless of size, all platforms in the scope of the Bill are covered if there are risks to children.

A number of Members, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy), have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered. However, all platforms, regardless of size, are in scope with regard to content that is illegal and to content that is harmful to children.

For too long, social media firms have also arbitrarily censored content just because they do not like it. With the passage of this Bill, all those things will be no more, because it creates parliamentary sovereignty over how the internet operates, and I am glad that the principles in the Bill command widespread cross-party support.

The pre-legislative scrutiny that we have gone through has been incredibly intensive. I thank and pay tribute to the DCMS Committee and the Joint Committee for their work. We have adopted 66 of the Joint Committee’s recommendations. The Bill has been a long time in preparation. We have been thoughtful, and the Government have listened and responded. That is why the Bill is in good condition.

Debbie Abrahams

Will the Minister give way?

Chris Philp

I must make some progress, because I am almost out of time and there are lots of things to reply to.

I particularly thank previous Ministers, who have done so much fantastic work on the Bill. With us this evening are my hon. Friend the Member for Gosport (Dame Caroline Dinenage) and my right hon. Friends the Members for Maldon (Mr Whittingdale) and for Basingstoke (Mrs Miller), but not with us this evening are my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright), who I think is in America, and my right hon. Friends the Members for Hertsmere (Oliver Dowden) and for Staffordshire Moorlands (Karen Bradley), all of whom showed fantastic leadership in getting the Bill to where it is today. It is a Bill that will stop illegal content circulating online, protect children from harm and make social media firms be consistent in the way they handle legal but harmful content, instead of being arbitrary and inconsistent, as they are at the moment.

Chris Philp

I have so many points to reply to that I have to make some progress.

The Bill also enshrines, for the first time, free speech—something that we all feel very strongly about—but it goes beyond that. As well as enshrining free speech in clause 19, it gives special protection, in clauses 15 and 16, for content of journalistic and democratic importance. As my right hon. Friend the Secretary of State indicated in opening the debate, we intend to table a Government amendment—a point that my right hon. Friends the Members for Maldon and for Ashford (Damian Green) asked me to confirm—to make sure that journalistic content cannot be removed until a proper right of appeal has taken place. I am pleased to confirm that now.

We have made many changes to the Bill. Online fraudulent advertisers are now banned. Senior manager liability will commence immediately. Online porn of all kinds, including commercial porn, is now in scope. The Law Commission communication offences are in the Bill. The offence of cyber-flashing is in the Bill. The priority offences are on the face of the Bill, in schedule 7. Control over anonymity and user choice, which was proposed by my hon. Friend the Member for Stroud (Siobhan Baillie) in her ten-minute rule Bill, is in the Bill. All those changes have been made because this Government have listened.

Let me turn to some of the points made from the Opposition Front Bench. I am grateful for the in-principle support that the Opposition have given. I have enjoyed working with the shadow Minister and the shadow Secretary of State, and I look forward to continuing to do so during the many weeks in Committee ahead of us, but there were one or two points made in the opening speech that were not quite right. This Bill does deal with systems and processes, not simply with content. There are risk assessment duties. There are safety duties. There are duties to prevent harm. All those speak to systems and processes, not simply content. I am grateful to the Chairman of the Joint Committee, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), for confirming that in his excellent speech.

If anyone in this House wants confirmation of where we are on protecting children, the Children’s Commissioner wrote a joint article with the Secretary of State in the Telegraph—I think it was this morning—confirming her support for the measures in the Bill.

When it comes to disinformation, I would make three quick points. First, we have a counter-disinformation unit, which is battling Russian disinformation night and day. Secondly, any disinformation that is illegal, that poses harm to children or that comes under the definition of “legal but harmful” in the Bill will be covered. Thirdly, if that is not enough, the Minister for Security and Borders, who is sitting here next to me, intends to bring forward legislation at the earliest opportunity to cover counter-hostile state threats more generally. This matter will be addressed in the Bill that he will prepare and bring forward.

I have only four minutes left and there are so many points to reply to. If I do not cover them all, I am very happy to speak to Members individually, because so many important points were made. The right hon. Member for Barking asked who was going to pay for all the Ofcom enforcement. The taxpayer will pay for the first two years while we get ready—£88 million over two years—but after that Ofcom will levy fees on these social media firms, so they will pay for regulating their activities. I have already replied to the point she rightly raised about smaller but very harmful platforms.

My hon. Friend the Member for Meriden (Saqib Bhatti) has been campaigning tirelessly on the question of combating racism. This Bill will deliver what he is asking for.

The hon. Member for Batley and Spen (Kim Leadbeater) and my hon. Friend the Member for Watford (Dean Russell) asked about Zach’s law. Let me take this opportunity to confirm explicitly that clause 150—the harmful communication clause, for where a communication is intended to cause psychological distress—will cover epilepsy trolling. What happened to Zach will be prevented by this Bill. In addition, the Ministry of Justice and the Law Commission are looking at whether we can also have a standalone provision, but let me assure them that clause 150 will protect Zach.

My right hon. Friend the Member for Maldon asked a number of questions about definitions. Companies can move between category 1 and category 2, and different parts of a large conglomerate can be regulated differently depending on their activities. Let me make one point very clear—the hon. Member for Bristol North West (Darren Jones) also raised this point. When it comes to the provisions on “legal but harmful”, neither the Government nor Parliament are saying that those things have to be taken down. We are not censoring in that sense. We are not compelling social media firms to remove content. All we are saying is that they must do a risk assessment, have transparent terms and conditions, and apply those terms and conditions consistently. We are not compelling, we are not censoring; we are just asking for transparency and accountability, which is sorely missing at the moment. No longer will those in Silicon Valley be able to behave in an arbitrary, censorious way, as they do at the moment—something that Members of this House have suffered from, but from which they will no longer suffer once this Bill passes.

The hon. Member for Bristol North West, who I see is not here, asked a number of questions, one of which was about—[Interruption.] He is here; I do apologise. He has moved—I see he has popped up at the back of the Chamber. He asked about codes of practice not being mandatory. That is because the safety duties are mandatory. The codes of practice simply illustrate ways in which those duties can be met. Social media firms can meet them in other ways, but if they fail to meet those duties, Ofcom will enforce. There is no loophole here.

When it comes to the ombudsman, we are creating an internal right of appeal for the first time, so that people can appeal to the social media firms themselves. There will have to be a proper right of appeal, and if there is not, they will be enforced against. We do not think it appropriate for Ofcom to consider every individual complaint, because it will simply be overwhelmed, by probably tens of thousands of complaints, but Ofcom will be able to enforce where there are systemic failures. We feel that is the right approach.

I say to the hon. Member for Plymouth, Sutton and Devonport (Luke Pollard) that my right hon. Friend the Minister for Security and Borders will meet him about the terrible Keyham shooting.

The hon. Member for Washington and Sunderland West (Mrs Hodgson) raised a question about online fraud in the context of search. That is addressed by clause 35, but we do intend to make drafting improvements to the Bill, and I am happy to work with her on those drafting improvements.

I have been speaking as quickly as I can, which is quite fast, but I think time has got away from me. This Bill is groundbreaking. It will protect our citizens, it will protect our children—[Hon. Members: “Sit down!”]—and I commend it to the House.

Question put and agreed to.

Bill accordingly read a Second time.

Madam Deputy Speaker (Dame Eleanor Laing)

The Minister just made it. I have rarely seen a Minister come so close to talking out his own Bill.

Online Safety Bill (Programme)

Motion made, and Question put forthwith (Standing Order No. 83A(7)),

That the following provisions shall apply to the Online Safety Bill:

Committal

(1) The Bill shall be committed to a Public Bill Committee.

Proceedings in Public Bill Committee

(2) Proceedings in the Public Bill Committee shall (so far as not previously concluded) be brought to a conclusion on Thursday 30 June 2022.

(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.

Consideration and Third Reading

(4) Proceedings on Consideration shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.

(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.

(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration and Third Reading.

Other proceedings

(7) Any other proceedings on the Bill may be programmed.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Money)

Queen’s recommendation signified.

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise the payment out of money provided by Parliament of:

(1) any expenditure incurred under or by virtue of the Act by the Secretary of State, and

(2) any increase attributable to the Act in the sums payable under any other Act out of money so provided.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Ways and Means)

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise:

(1) the charging of fees under the Act, and

(2) the payment of sums into the Consolidated Fund.—(Michael Tomlinson.)

Question agreed to.

Deferred Divisions

Motion made, and Question put forthwith (Standing Order No. 41A(3)),

That at this day’s sitting, Standing Order 41A (Deferred divisions) shall not apply to the Motion in the name of Secretary Nadine Dorries relating to Online Safety Bill: Carry-over.—(Michael Tomlinson.)

Question agreed to.