Thursday 13th January 2022

(2 years, 2 months ago)

Commons Chamber
Read Hansard Text Read Debate Ministerial Extracts
[Relevant documents: E-petition 575833, Make verified ID a requirement for opening a social media account, and e-petition 332315, Ban anonymous accounts on social media”; oral evidence taken before the Petitions Committee on 21 May and 2 July 2020, and 2, 16 and 23 November, and 1 December 2021, on Tackling Online Abuse, HC 364 [Session 2019-21] and HC 479.]
14:55
Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Parliament Live - Hansard - - - Excerpts

I beg to move,

That this House has considered the Report of the Joint Committee on the draft Online Safety Bill, HC 609.

I would like to start by thanking the members and Clerks of our Joint Committee, who put in a tremendous effort to deliver its report. In 11 sitting weeks, we received more than 200 submissions of written evidence, took oral evidence from 50 witnesses and held four further roundtable meetings with outside experts, as well as Members of both Houses. I am delighted to see my Joint Committee colleagues Lord Gilbert and Baroness Kidron in the Gallery. I thank the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Croydon South (Chris Philp), and the Secretary of State for the open and collaborative way in which they worked with the Committee throughout the process and our deliberations. I also thank Ofcom, which provided a lot of constructive guidance and advice to the Committee as we prepared the report.

This feels like a moment that has been a long time coming. There has been huge interest on both sides of the House in the Online Safety Bill ever since the publication of the first White Paper in April 2019, and then there were two Government responses, the publication of the draft Bill and a process of pre-legislative scrutiny by the Joint Committee. I feel that the process has been worth while: in producing a unanimous report, I think the Committee has reflected the wide range of opinions that we received and put forward some strong ideas that will improve the Bill, which I hope will get a Second Reading later in the Session. I believe that it has been a process worth undertaking, and many other Lords and Commons Committees have been looking at the same time at the important issues around online safety and the central role that online services play in our lives.

The big tech companies have had plenty of notice that this is coming. During that period, have we seen a marked improvement? Have we seen the introduction of effective self-regulation? Have the companies set a challenge to Parliament, saying “You don’t really need to pass this legislation, because we are doing all we can already”? No. If anything, the problems have got worse. Last year, we saw an armed insurrection in Washington DC in which a mob stormed the Capitol building, fuelled by messages of hate and confrontation that circulated substantially online. Last summer, members of the England football team were subject to vile racist abuse at the end of the final—the football authorities had warned the companies that that could happen, but they did not prepare for it or act adequately at the time.

As Rio Ferdinand said in evidence to the Joint Committee, people should not have to put up with this. People cannot just put their device down—it is a tool that they use for work and to stay in communication with their family and friends—so they cannot walk away from the abuse. If someone is abused in a room, they can leave the room, but they cannot walk away from a device that may be the first thing that they see in the morning and one of the last things that they see at night.

We have seen an increase in the incidence of child abuse online. The Internet Watch Foundation has produced a report today that shows that yet again there are record levels of abusive material related to children, posing a real child safety risk. It said the same in its report last year, and the issues are getting worse. Throughout the pandemic, we have seen the rise of anti-vaccine conspiracies.

Jim Shannon Portrait Jim Shannon (Strangford) (DUP)
- Parliament Live - Hansard - - - Excerpts

I commend the hon. Gentleman for bringing this forward. We have a colleague in Northern Ireland, Diane Dodds MLA, who has had unbelievably vile abuse towards her and her family. Does the hon. Gentleman agree that there is a huge loophole and gap in this Bill—namely, that the anonymity clause remains that allows comments such as those to my colleague and friend Diane Dodds, which were despicable in the extreme? There will be no redress and no one held accountable through this Bill. The veil of anonymity must be lifted and people made to face the consequences of what they are brave enough to type but not to say.

Eleanor Laing Portrait Madam Deputy Speaker (Dame Eleanor Laing)
- Hansard - - - Excerpts

Order. The hon. Gentleman is not trying to make a speech, is he? No, he is not.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The hon. Gentleman raises an important issue. The Committee agreed in the report that there must be an expedited process of transparency, so that when people are using anonymity to abuse other people—saying things for which in public they might be sued or have action taken against them—it must be much easier to swiftly identify who those people are. People must know that if they post hate online directed at other people and commit an offence in doing so, their anonymity will not be a shield that will protect them: they will be identified readily and action taken against them. Of course there are cases where anonymity may be required, when people are speaking out against an oppressive regime or victims of abuse are telling their story, but it should not be used as a shield to abuse others. We set that out in the report and the hon. Gentleman is right that the Bill needs to move on it.

We are not just asking the companies to moderate content; we are asking them to moderate their systems as well. Their systems play an active role in directing people towards hate and abuse. A study commissioned by Facebook showed that over 60% of people who joined groups that showed extremist content did so at the active recommendation of the platform itself. In her evidence to the Committee, Facebook whistleblower Frances Haugen made clear the active role of systems in promoting and driving content through to people, making them the target of abuse, and making vulnerable people more likely to be confronted with and directed towards content that will exacerbate their vulnerabilities.

Facebook and companies like it may not have invented hate but they are driving hate and making it worse. They must be responsible for these systems. It is right that the Bill will allow the regulator to hold those companies to account not just for what they do or do not take down, but for the way they use the systems that they have created and designed to make money for themselves by keeping people on them longer, such that they are responsible for them. The key thing at the heart of the Bill and at the heart of the report published by the Joint Committee is that the companies must be held liable for the systems they have created. The Committee recommended a structural change to the Bill to make it absolutely clear that what is illegal offline should be regulated online. Existing offences in law should be written into the Bill and it should be demonstrated how the regulator will set the thresholds for enforcement of those measures online.

This approach has been made possible because of the work of the Law Commission in producing its recommendations, particularly in introducing new offences around actively promoting self-harm and promoting content and information that is known to be false. A new measure will give us the mechanism to deal with malicious deepfake films being targeted at people. There are also necessary measures to make sure that there are guiding principles that the regulator has to work to, and the companies have to work to, to ensure regard to public health in dealing with dangerous disinformation relating to the pandemic or other public health issues.

We also have to ensure an obligation for the regulator to uphold principles of freedom of expression. It is important that effective action should be taken against hate speech, extremism, illegal content and all harmful content that is within the scope of the Bill, but if companies are removing content that has every right to be there—where the positive expression of people’s opinions has every right to be online—then the regulator should have the power to intervene in that direction as well.

At the heart of the regime has to be a system where Ofcom, as the independent regulator, can set mandatory codes and standards that we expect the companies to meet, and then use its powers to investigate and audit them to make sure that they are complying. We cannot have a system that is based on self-declared transparency reports by the companies where even they themselves struggle to explain what the results mean and there is no mechanism for understanding whether they are giving us the full picture or only a highly partial one. The regulator must have that power. Crucially, the codes of practice should set the mandatory minimum standards. We should not have Silicon Valley deciding what the online safety of citizens in this country should be. That should be determined through legislation passed through this Parliament empowering the regulator to set the minimum standards and take enforcement action when they have not been met.

We also believe that the Bill would be improved by removing a controversial area, the principles in clause 11. The priority areas of harm are determined by the Secretary of State and advisory to the companies. If we base the regulatory regime and the codes of practice on established offences that this Parliament has already created, which are known and understood and therefore enforced, we can say they are mandatory and clear and that there has been a parliamentary approval process in creating the offences in the first place.

Where new areas of harm are added to the schedules and the codes of practice, there should be an affirmative procedure in both Houses of Parliament to approve those changes to the code, so that Members have the chance to vote on changes to the codes of practice and the introduction of new offences as a consequence of those offences being created.

The Committee took a lot of evidence on the question of online fraud and scams. We received evidence from the Work and Pensions Committee and the Treasury Committee advising us that this should be done: if a known scam or attempt to rip off and defraud people is present on a website or social media platform, be it through advertising or any kind of posting, it should be within the scope and it should be for the regulator to require its removal. There should not be a general blanket exemption for advertising, which would create a perverse incentive to promote such content more actively.

Kevin Hollinrake Portrait Kevin Hollinrake (Thirsk and Malton) (Con)
- Parliament Live - Hansard - - - Excerpts

I thank my hon. Friend for his work on this important issue. Does he agree, as referred to in the report, that platforms must be required to proactively seek out that content and ensure it is changed, and if not, remove it, rather than all removals being prompted by users?

Damian Collins Portrait Damian Collins
- Parliament Live - Hansard - - - Excerpts

It is vital that companies are made to act proactively. That is one of the problems with the current regime, where action against illegal content is only required once it is reported to the companies and they are not proactively identifying it. My hon. Friend is right about that, particularly with frauds and scams where the perpetrators are known. The role of the regulator is to ensure that companies do not run those ads. The advertising authorities can still take action against individual advertisers, as can the police, but there should be a proactive responsibility on the platforms themselves.

If you will allow me to say one or two more things, Madam Deputy Speaker, we believe it is important that there should be user redress through the system. That is why the Committee recommended creating an ombudsman if complaints have been exhausted without successful resolution, but also permitting civil redress through the courts.

If an individual or their family has been greatly harmed as a consequence of what they have seen on social media, they may take some solace in the fact that the regulator has intervened against the company for its failures and levied fines or taken action against individual directors. However, as an individual can take a case to the courts for a company’s failure to meet its obligations under data protection law, that should also apply to online safety legislation. An individual should have the right, on their own or with others, to sue a company for failing to meet its obligations under an online safety Act.

I commend the report to the House and thank everyone involved in its production for their hard work. This is a Bill we desperately need, and I look forward to seeing it pass through the House in this Session.

None Portrait Several hon. Members rose—
- Hansard -

Eleanor Laing Portrait Madam Deputy Speaker (Dame Eleanor Laing)
- Hansard - - - Excerpts

As the House can see, a great many people want to speak this afternoon, so we will have to have a time limit of five minutes with immediate effect.

15:07
Margaret Hodge Portrait Dame Margaret Hodge (Barking) (Lab)
- Parliament Live - Hansard - - - Excerpts

I congratulate the hon. Member for Folkestone and Hythe (Damian Collins) and the members of his Committee on bringing forward an incredibly thorough and very good report. I know Ministers have been consulting well with all Back Benchers, and I hope they do not pay lip service to the report’s conclusions, but really take on its important recommendations. What is interesting about this whole debate is that there is a broad consensus on the Back Benches. None of us are bound by ideology on these issues; our approach is based on our experience, the data and the wide body of research.

I will also say at the beginning that the business model of the platforms means that they will never tackle this themselves. They make their money by encouraging traffic on their platforms, and they encourage traffic by allowing abusive content to exist there. Their algorithms are there almost to control and encourage more abusive content. The idea that there can be any self-regulation in the legislation to be proposed by the Government is false.

I will draw attention to three sets of issues in the short time available to me. The first, the recommendations on paid-for scams and frauds, has already been discussed. It is ridiculous that user-generated content can be subject to regulation but that paid-for scams and frauds cannot be. Everybody who gave evidence to the Committee, including the Financial Conduct Authority, pleaded for its inclusion. The figure I have is from Action Fraud: 85% of the £1.7 billion lost in fraudulent scams in the past year resulted from cyber-enabled frauds. During the pandemic, this figure of course exploded. Again, there is no incentive for the platforms to do anything about this. They get paid for by the advertisements so they wish to encourage them. Indeed, there is a double benefit in this particular space for them, because the FCA also pays for them to prioritise the legitimate websites over the scam adds, so again self-regulation will not work. I know that Ministers support the proposal, and I hope that they are not swayed by advice that it is not legally possible, as I just do not accept that. I hope that they do not miss this opportunity by way of promises of legislation down the line.

Stephen Timms Portrait Stephen Timms (East Ham) (Lab)
- Parliament Live - Hansard - - - Excerpts

I very much agree with the point my right hon. Friend is making and with the recommendation in the report. I wonder whether she noticed that the Prime Minister told the Liaison Committee in July that

“one of the key objectives of the Online Safety Bill is to tackle online fraud.”

Does she agree that it cannot possibly do that if it misses out scam adverts?

Margaret Hodge Portrait Dame Margaret Hodge
- Parliament Live - Hansard - - - Excerpts

I completely agree with my right hon. Friend on that, and I hope that the Minister will confirm that he will include this in the legislation.

The second issue I wish to raise relates to anonymity. No one wants to undermine anonymity—we all recognise that it is crucial for whistleblowers, for victims of domestic violence or child abuse, and for others—but we do want to tackle anonymous abuse. Sadly, most of the vile abuse that appears online is anonymous, as we have seen in the spreading of disinformation, particularly in relation to the pandemic. I have seen it in my experience, and it really undermines my right to participate in democratic debate. If people paint someone online as being a terrible person, as a hypocrite or as a hateful, wicked woman, which is what they do with me, that person is then not trusted on anything and therefore their voice is shut down in the democratic debate.

What we are all after is not tackling anonymity but ensuring third party verification of the identity of people so that they can be traced if and when they put abusive content online. The proposals that came from the Law Commission, and which one of the four ex-Culture Secretaries who has worked on this issue has diligently pursued, to introduce a new offence to tackle serious online harms more effectively is very important. It is about shifting from content to the effects of the online harm.

My third point relates to director liability. All my experience in working in the field of tackling illicit finance and economic crime demonstrates to me that if we do not introduce director liability for when wrongdoing occurs in the actions of individuals associated with a company, we do not change the behaviour of those companies. Even fines of £50 million are not significant against Facebook’s gross revenue of more than £29 billion. I do not understand why we have to wait for two years to implement director liability, as it could be done immediately. I would be grateful to the Minister if he said that he will implement that.

The last thing I should say, in my final seconds, is on anonymity. I would like the Minister simply to confirm this afternoon whether he will tackle anonymous abuse and put in place the Law Commission’s proposals. When is the timeframe for that? I very much welcome the report and commend all those who worked so hard to put it together, and I hope we can make progress swiftly on a problem that is growing in British society and that is undermining, not supporting, democracy.

15:13
Julian Knight Portrait Julian Knight (Solihull) (Con)
- Parliament Live - Hansard - - - Excerpts

I congratulate my hon. Friend the Member for Folkestone and Hythe (Damian Collins) on securing this debate, which is clearly sparking enormous interest. I welcome the majority of the Joint Committee’s recommendations. Indeed, they very much build on the work already carried out by the Select Committee on Digital, Culture, Media and Sport over recent years. When the big tech giants were in their infancy, the Select Committee, which I am proud to chair today, was already leading on some of this work. The Select Committee has been scrutinising the online harms White Paper over the past year and is continuing to do so, and it will be coming up with its own recommendations shortly. The Joint Committee’s report even acknowledges the ongoing work of the Select Committee by stating

“the DCMS Committee has maintained its interest on the issue through the work of its Sub-committee”

—its standing Sub-Committee—

“on Online Harms and Disinformation”.

Let me add to my hon. Friend’s speech by identifying some points with which I agree, but which go above and beyond what he actually said. First, I support the Joint Committee’s work on journalistic content, and its recommendation that existing protections relating to journalistic content and content of democratic importance should be replaced by a single statutory requirement for proportionate systems and processes to protect

“content where there are reasonable grounds to believe it will be in the public interest”.

I also welcome some of the work that the Joint Committee has done in exploring age assurances, building on the work already done by the Select Committee. In particular, it rightly makes several recommendations for Ofcom to establish minimum standards for age assurance technology and governance linked to risk profiles to ensure that third-party and provider-designed assurance technologies are privacy-enhancing and rights-protecting, and that in commissioning such services, providers are restricted in respect of the data for which they can ask.

It is right that the Joint Committee acknowledges the serious threats that misinformation poses to society. In recent months we have witnessed the rise in fake news from the anti-vaccine campaigns as it has hit our social media feeds. I therefore support the recommendation that there should be

“content-neutral safety by design requirements, set out as minimum standards in mandatory codes of practice.”

However, the recommendation that a permanent Joint Committee be established as

“a solution to the lack of transparency and…oversight”

concerns me, and my Committee, for a range of reasons. First, it would go against a long-standing parliamentary convention. Never before has a Joint Committee been established merely to provide post-legislative scrutiny. I know some Members have suggested that a Joint Committee on online harms would have terms of reference mirroring those of the Joint Committee on Human Rights and the Intelligence and Security Committee, but the Joint Committee on Human Rights was certainly never enshrined in the Human Rights Act 1998, and the responsibility of the Intelligence and Security Committee is to provide oversight for policies, expenditure, and operations adopted by MI5, MI6 and GCHQ.

We fear that the creation of such a standing Joint Committee would not only go against parliamentary convention, but would set a bad precedent for many decades to come. If some particularly complex legislation comes to the House in the future, will we just keep on setting up Joint Committees to provide post-legislative scrutiny? Of course we will not—we would be very foolish to do so—but this recommendation sets a precedent for it to happen. When I asked about the cost of establishing the Joint Committee, I was told that it would be £500,000 a year. Moreover, the work is already being done by an elected Committee of the House and a Committee in the other place.

What is the point of establishing another Committee merely to replicate the work that the Select Committee is already doing? If our Committee does need to conduct post-legislative scrutiny of legislation that is particularly complex and groundbreaking, we have a Sub-Committee for the purpose. We recognise the importance of this legislation and this area, and we will continue to scrutinise it through our Sub-Committee and through Standing Order No.152.

I raised this matter with the Leader of the House in my capacity as the Select Committee Chair, and I am grateful to him for his response, in which he said:

“Business Managers and I are of the view that this scrutiny can be arranged through current Standing Orders and that it should not require legislation, nor extraordinary powers, to achieve.”

I know from my conversations with Opposition Front Benchers that they strongly support retaining such scrutiny within current parliamentary procedures, rather than innovating in a way that could be damaging in the long term.

I welcome many aspects of this report, which builds on the Select Committee’s own report, but fine tuning is needed before the Bill comes to the House. My Committee stands ready to issue those fine-tuning exercises, and will do so in the coming days.

15:18
Chi Onwurah Portrait Chi Onwurah (Newcastle upon Tyne Central) (Lab)
- Parliament Live - Hansard - - - Excerpts

I congratulate the Joint Committee on an excellent report, consisting of 191 pages of well-researched, balanced, temperate and intelligent analysis and recommendations. It is rare to find such qualities when it comes to subjects as important as online harms and, indeed, technology in the society of today. I will not go through the report’s recommendations in detail because I do not have the time, but also because I support and welcome just about all of them. I will mention, for example, the design for safety recommendation, which I think is excellent, but it is one among many excellent recommendations. Instead, I will focus my remarks on why this report needs to be implemented as quickly as possible and what else needs to happen.

I want to start by speaking in praise of technology. I count myself as a tech evangelist. We have to think how many lives have been saved by remote medicine, how many marriages have been saved by not having to argue about the best way to get directions to an event, how much joy has been shared through cat memes or whatever, and how many businesses have been started on such platforms. Technology can and should be a force for good. That is why I went into engineering—to make the world work better for everyone—and my final year project at Imperial College was on a remote alarm to support people who need care in their own homes.

Engineering should be a force for good, but as the report sets out, that is no longer how it is seen. Many of my constituents, for example, feel they are being tracked, monitored, surveilled and analysed, and they feel undermined because they do not want to have to go online to do what they want to do without feeling safe and secure. Self-regulation is broken, as the report says, and it did not need to be this way. Some of us may remember concerns, back when the web started out, that if it was used for commercial purposes, people would be flamed with emails and condemned for trying to advertise or do direct marketing on the web. Somehow, however, the web was captured by those on what I can only describe as the libertarian right, who sought to maintain the lie that technology and the internet were nothing to do with Government, while building monopolistic platforms with more money and more power than most Governments. Their attitude to regulation and Government, as I remember from my days at Ofcom, was often that if they ignored them, regulation and Government would go away.

Now, of course, the tech giants use their immense riches to wield immense power over Governments—whether in opposing workers’ rights in Silicon valley or in delaying and minimising regulation here. In that, they have been all too successful, and I have to say that it was with the support of a series of Conservative Governments who wanted to leave this to the market and believed that the state was too slow or too stupid to regulate to keep people safe, while actively cutting the part of the state designed to do so. That is why, in my view, this Online Safety Bill is a decade too late. These measures cannot be in place for another year—and that is if the Government act in double quick time, which they seem unable to do—which means that it will be 2023 before we have online safety regulation.

Afzal Khan Portrait Afzal Khan (Manchester, Gorton) (Lab)
- Hansard - - - Excerpts

I too want to say how important this work is, and I urge that this Bill is desperately needed. Refuge has found that one in three women have at some time in their life experienced abuse online. I would say that Muslim women in particular experience a triple whammy of race, faith and gender, and Tell MAMA has told us of the 40% increase in abuse against Muslim women during the lockdown. I hope my hon. Friend agrees that the social media companies must be held to account for their repeated failures.

Chi Onwurah Portrait Chi Onwurah
- Parliament Live - Hansard - - - Excerpts

I thank my hon. Friend for that intervention because he is absolutely right: women in particular have been subjected to harm online, and that is one of the reasons why more women and more people from ethnic minorities need to take part in designing and developing the web and platforms in the future.

I think it is really important to recognise that the last Labour Government put in place forward-looking regulation in the Communications Act 2003, which set out the landscape for regulating growing tech companies for the next decade. Given that a series of Conservative Governments have put in place no regulation, that the Bill cannot be in place for another year is a real indictment of them and shows a level of negligence that it is difficult to recover from.

In my last minute, let me just say what we need to be looking towards for the future. The Bill and the report do not reflect the development around web 3.0. We are looking to more decentralisation of the web, which is being reflected in the use of blockchain as part of the future architecture of the web. For some, blockchain is a way of avoiding government. If someone has blockchain, they do not need government. It is that kind of libertarian, right-wing approach to digital, and any Government need to be constantly looking forward to see what regulation will be required. We also need to have more emphasis on people’s rights, on access to algorithms and on their regulation. I look forward to this Bill being in place as soon as possible.

00:01
Caroline Dinenage Portrait Dame Caroline Dinenage (Gosport) (Con)
- Hansard - - - Excerpts

I echo the words of thanks to the Joint Committee and its Clerks, under the excellent chairmanship of my hon. Friend the Member for Folkestone and Hythe (Damian Collins), for this thorough and weighty report. It includes some of the big beasts in the world of online safety, and that is important, because this is one of the single most important pieces of legislation of our time. It is absolutely groundbreaking. It is vast; it is almost five Bills in one, and no country has attempted to regulate the internet so comprehensively.

The pressure is on, not least because we have got into this bad habit of describing this Bill as the calvary coming over the hill for the online world and all the ills it contains. Someone compared it with the motor car, and it has taken decades of legislation to address the safety issues of that evolving technology, so we will never do this in a oner, but this Bill needs to be the very best possible starting point—the foundations to face the current threats, but also the challenges of the future.

I lived with this Bill for 20 months; I can talk about it forever, but I will not. Let us start at the beginning with the algorithms. We have all seen them. We start watching cute videos—in my case, it is usually of babies falling asleep in their own food, but that is probably just me—and immediately we are swept into this rabbit hole of suggested content, and it is designed to keep us engaged as long as possible, because that is where the money is: to capture our attention and sell it to the highest bidder. Do not forget, if we are not paying for the product, it is most likely we are the product.

It gets more sinister than that, though, because that same algorithm that is sending me those cute babies is recommending self-harm to a vulnerable teenager or spreading wildly dangerous disinformation about the dangers of covid. Algorithms are echo chambers. They take our fears and paranoia and surround us with unhealthy voices that reinforce them, however dangerous and hateful.

According to the 2020 Netflix documentary “The Social Dilemma”, former employees of the largest social media companies who were integral to the early development of those algorithms say that addiction is built into the design. Many of them say that the platforms are so unhealthy, they would not let their own kids anywhere near them. As the report says, tackling these design risks at source is more effective than just trying to take down individual pieces of content. We saw that with the outbreak of covid, when 5G masts were burned to the ground because of some wild conspiracy theory that suggested they were the root cause of covid. 3G and 4G masts were also destroyed, because it turns out these people are not wildly bright and cannot tell the difference. We cannot censor this stuff, because that just fuels the sense of a conspiracy theory of state conspiring. We need to stop it before it is force-fed into people and ensure that there is balance. Platforms must tackle the design features that exacerbate the risk of harm, and the legislation should include a specific responsibility for them to do it and for the regulator to enforce it.

I want to talk quickly about a couple of the specifics. There cannot be a Member of the House who has not supported a constituent devastated by online fraud. It is growing exponentially, and there is almost universal agreement that the legislation should address it. That is why we changed things from the White Paper to the draft Bill, but I agree with the Committee that the measure should be strengthened, and it should also be extended to cover paid-for advertising.

Child protection has always been a cornerstone of the Bill. I have no doubt that social media platforms are where the volume is, in terms of both content and people, and in practice it is where very young children are most likely to stumble over really unpleasant content. However, it is not enough to include only user-generated content. The Bill’s credibility will be undermined overnight if the largest commercial pornography providers can keep hosting extreme content and putting children at risk. I would therefore like to see the Bill extended, in line with the age appropriate design code. That would be a really good way of dealing with that, as the report suggests.

On categorisation, no doubt big platforms and search engines are where the volume is, but the digital world changes at lightning pace and trends go viral overnight. Risk should not be judged on size—it must be judged on risk. Emerging platforms can be hotbeds of extremism and really unpleasant content, and they must be appropriately regulated.

A final quick note of caution about the report’s all-or-nothing tone. It makes great suggestions that would strengthen the Bill, but that has been years in the making. I did a tiny bit of it. It has involved many Ministers and a team of fantastic officials, many of whom have worked on it from the beginning. The Bill is like a huge, complicated tapestry: you pull one thread and others can unravel further down the line. The online world is so fast moving—it is evolving at a rate of knots. We have to think carefully about how we change the Bill. Otherwise, it will be obsolete before the ink is dry.

15:31
Liz Twist Portrait Liz Twist (Blaydon) (Lab)
- Parliament Live - Hansard - - - Excerpts

I speak as chair of the all-party parliamentary group on suicide and self-harm prevention and intend to deal with those issues. Suicide remains a major public health problem, with the highest suicide rates among men aged 45 to 49. It is the biggest killer of young people aged 16 to 24 and the suicide rate for young women is now at its highest on record.

While suicide and self-harm are complex and rarely caused by one thing, in many cases, the internet is involved. A 2017 inquiry into suicides of young people found suicide-related internet use in nearly 26% of deaths in under-20s and 13% of deaths in 20 to 24-year olds. I therefore welcome the Committee’s report, in particular its recommendation that encouraging or assisting suicide be included in the primary legislation as a priority illegal harm.

Self-harm, which is a major risk for suicide, is also becoming much more prevalent, having tripled among young people in the last 15 years. One in 15 adults in England report that they have self-harmed. I also welcome the Committee’s recommendation, in line with the Law Commission’s, that encouraging or assisting a person to seriously self-harm should be made illegal.

While some harmful suicide content is illegal—and some self-harm content could be in future—there will be suicide and self-harm content that can be distressing, triggering and instructive, that can act in part to maintain or exacerbate self-harm and suicidal behaviours, and that is legal but harmful. Will the Minister confirm that suicide and self-harm will also be included as a priority legal but harmful content in the final Bill?

Samaritans has a longstanding concern that the draft Bill lets smaller platforms such as online community groups, forums and message boards, where some of the most harmful suicide and self-harm content can be found, completely off the hook, particularly when it comes to protecting adults. The Committee received written evidence from people who contacted Samaritans to share their lived experience of suicide. One respondent wrote:

“The people using the bigger sites will just flood the smaller sites if their content starts getting removed. The standard needs to be the same across all sites.”

Another wrote:

“If suicidal people can’t find what they are looking for at large sites they will just go on to the smaller sites so it doesn’t solve the problem.”

Eighteen years old is not a cut-off point for experiencing suicidal ideation, which can fluctuate over the course of a single day. Those who experience that are more likely to be more vulnerable and at greater risk of harm from legal but harmful suicide and self-harm content.

Another respondent with lived experience told the Committee:

“Harmful and accessible suicide and self-harm online content can be harmful at any age. I am in my fifties and would be tempted to act on this information if I felt suicidal again”.

Furthermore, it has been unclear whether Wikipedia, where some of the most harmful content can be found, would be in scope of the legislation. I therefore welcome the Committee’s recommendation that the categorisation of services in the draft Bill be overhauled and that all platforms consider the risk that their service poses in relation to children and adults.

There is a clear imperative to tackle suicide and self-harm content online. Taking a partial approach to such content will undermine the UK Government’s efforts to prevent suicide and to achieve the aims of the cross-Government national suicide prevention strategy in England. A key aspect of suicide prevention is the reduction of access to means, and reducing the availability of harmful and instructional information is one way of achieving that.

No caveats around tackling harmful suicide and self-harm content—size of platform, age of user—should be established that will diminish the legislation’s ability to tackle harmful content in this space. Can the Minister confirm in relation to suicide and self-harm content that all platforms and people of all ages will be in scope in the final Bill presented to this House?

15:35
Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Parliament Live - Hansard - - - Excerpts

Back in 2013, when I was Secretary of State for Culture, Media and Sport—I am one of at least two former Culture Secretaries in this debate—I visited a number of social media companies in Silicon Valley. I spent time looking at their ethos, priorities and vision, and I rapidly came to the conclusion that statutory regulation would be absolutely essential for the sector in the UK; I pay tribute to my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright) for taking that forward.

I welcome the Joint Committee’s work on this issue. The report is absolutely right to start by saying that self-regulation of online services has failed. It has failed to ensure the design of products that are safe to use, that products are age-appropriate and that content is properly monitored and moderated—hygiene factors in almost any other sector in this country.

When she gave evidence to the Joint Committee, the Secretary of State was completely right in saying that the first priority has to be to end online abuse. Regulation will only ever be part of that, as my hon. Friend the Member for Gosport (Dame Caroline Dinenage) said: there are many other elements to the issue. I would like to touch on a couple of those and underline some of the points in the report.

Better law that properly recognises new types of crime in the online world has to be a priority of the Government now—they have gone a great deal of the way towards recognising that following the Law Commission’s reports, but there is much more to do, particularly given the violence against women and girls strategy and its acknowledgement that online abuse disproportionately affects women. The Joint Committee’s report touches on this issue, but I would go further.

The Government have already recognised the need to criminalise cyber-flashing and deep fake, but the development of such things as nudification software shows how sickeningly ingenious the sector is in inventing new and different ways to perpetrate harm. When it comes to online harm, we need to make sure that the law is better and future-proofed. As well as putting current laws into the Bill, will the Minister produce a schedule of other areas against which the Government intend to legislate—particularly in the area of intimate image abuse—to make sure that all the law is up to date? The regulator needs a strong criminal law in place if it is to be effective.

I underline the issue of individual redress. I whole-heartedly support the Joint Committee recommendation on reporting mechanisms and the idea of an online safety ombudsman; that will be really important to make sure that the Bill does not appear to our constituents, the people who suffer online abuse, as something remote and not practically helpful in their day-to-day life online. I would go one step further and make sure that victim support groups are also well funded because many more victims will be coming forward.

A number of right hon. and hon. Members have touched on the issue of anonymity. We know that the Bill is silent on that issue at the moment, and the Joint Committee is right to recommend changes. I do not fully agree with its recommendations; in my view, the evidence shows that anonymity creates an atmosphere of impunity for those who are abusive. Polling by Compassion in Politics shows that more than 80% of social media users would provide evidence to get verified accounts, and making verified accounts the default option—not the only option—would help to stop some of the abuse. It would not stop all of the abuse, of course, but it would help to change the ethos.

The Joint Committee’s report underlines the importance of having evidence, as the lack of transparency and reporting makes scrutiny of the impact of social media very difficult. I wholeheartedly support the idea of ensuring better, clearer and more transparent reporting.

A self-regulated online world has failed. Statutory regulation is the only way forward, but we need to encourage others around the world to follow suit. I applaud the Government for their approach, and I would add to the report’s recommendations by saying that the Government should use their work in developing strong relations around the world and in establishing new trade agreements to discuss the stake we all have in a safe and healthy online world.

15:40
Matt Rodda Portrait Matt Rodda (Reading East) (Lab)
- Parliament Live - Hansard - - - Excerpts

I am grateful for the opportunity to speak in this important debate. I support the Joint Committee’s work, and I am grateful to the Chair, the hon. Member for Folkestone and Hythe (Damian Collins), and the members of the Committee for their efforts in this important area.

I would like to raise the case of a young boy from Reading, Olly Stephens, who was killed in a most brutal attack, a knifing, in a park where he was lured through social media. I hope to set out some of the concerns of his family and our local community about this dreadful crime.

I pay tribute to Olly. He was just 13 when he died, and he had his whole life ahead of him. He was a livewire at school and a likable boy. It is simply impossible to imagine what his parents and his family are going through. It is now a year since he passed away, and they had a memorial service at the beginning of January that was incredibly moving and very difficult. My heart and the hearts of people in the local community go out to the Stephens family at this incredibly difficult time.

I thank Stuart and Amanda Stephens and, indeed, the community as a whole for their campaigning work to try to raise awareness of knife crime—[Hon. Members: “Hear, hear!”] Thank you. And to raise awareness of the connection between knife crime and social media.

The background to the attack and the way in which it involved social media is very clear and quite shocking. First, Olly met the two boys who killed him online—that was the connection between them. Secondly, and most crucially of all, he was lured to the park where he was stabbed. A girl sent him a message online asking him to come to the park. She had separately messaged other young people asking someone to stab Olly. This was on a social media platform, and you can imagine how awful it is.

The third point that is important for us to consider today is that the boys who killed Olly—they were very young teenagers—were using 11 different social media platforms, and they were sharing images of knives. Imagine teenagers flicking and playing with knives in their bedroom, videoing it and putting that shocking content up on social media. None of those 11 platforms took down that content. That is the level we are talking about, which is why I urge the Minister particularly to address the connection between knife crime and social media. I am sure he will respond on that point.

Once again, and I hope the Minister will be able to reply in detail, I call for action from the Government on behalf of Olly’s parents and on behalf of the local community in Reading and Woodley. I know some of these points are in the report, but I would particularly like the Minister to address the importance of age restrictions, the importance of ending anonymity online and the importance of forcing companies to take down harmful content. How can it be right that powerful and very wealthy companies are allowed to put clearly dangerous content online, such as content featuring knives, and not take it down immediately?

Finally and crucially, because this was apparently also a factor in the case, we must ensure that companies co-operate with the police. I want to look into this further, but I understand it is possible that the social media company where the message was shared inciting this criminal act did not fully co-operate with the police. I want to hear more about it, but I understand that is a possibility. I therefore ask the Minister also to ensure that companies operating in the UK are regulated in the UK, so that we can protect our young people from this dreadful form of crime.

In conclusion, I am grateful to the Chair of the Joint Committee and his team for their work on this important matter, and I also thank the Minister and my colleague the shadow Minister for their work. I hope that we can now move forward together to tackle this awful abuse and other forms of crime.

15:45
Jeremy Wright Portrait Jeremy Wright (Kenilworth and Southam) (Con)
- Parliament Live - Hansard - - - Excerpts

It is a pleasure to follow the hon. Member for Reading East (Matt Rodda), who I think illustrated clearly why this Bill matters. I want to joint the chorus of warm congratulations to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and his entire Committee on their remarkable work in producing such an impressive report on what is a very complex Bill. They have done the House a huge favour by suggesting ways in which the Bill could be made more straightforward and more focused on its overall objectives.

The Bill covers unlawful content as well as legal but harmful content, and it is in the latter category that the definitional challenges apply. Of course, we see in that challenge a conflict between specificity and flexibility. The legislation and the regulation that we create need to be specific enough that those subject to it know what they have to do, but flexible enough to keep up with what is a changing online world. The overarching duty of care set out in the initial White Paper was designed to give that adaptability and to encourage proactivity on the part of platforms in identifying and responding to emerging harms, and there is no doubt that that needs refinement.

I think that the Committee’s recommendation that there should be a requirement to have in place proportionate systems and processes for identifying and mitigating reasonably foreseeable risks of harm arising from regulated activity defined under the Bill is largely an elegant way to square that circle and keep some sense of control in this place of what harms we are content for the regulator to act against. However, I do not think that the Committee would claim that this is the last word on the subject, and nor should it be, because there are inherent risks of inflexibility—legislating to change harms is cumbersome and time-consuming.

There is also a risk of inconsistency, even with the Committee’s approach elsewhere. I am thinking of the Committee’s approach to defining content that is harmful to children, which it defines as content or activity that is either specified on the face of the Bill or in regulation, or—this is the crucial bit—where there is a “reasonably foreseeable risk” that it would be likely to cause harm. In other words, there needs to be some flexibility to oblige platforms to deal with harms that are not defined in regulations or in the Bill as they emerge in a fast-changing landscape, and I think that needs to be reflected more broadly too.

David Johnston Portrait David Johnston (Wantage) (Con)
- Hansard - - - Excerpts

My right hon. and learned Friend makes an important point about flexibility. I wonder whether he is familiar with the work of the Epilepsy Society, which has found that people with photo-sensitive epilepsy are being sent flashing images in the hope that that might induce a seizure. Does he feel that that type of harm ought to be incorporated in the definitions in the Bill as well?

Jeremy Wright Portrait Jeremy Wright
- Parliament Live - Hansard - - - Excerpts

Sadly, I am familiar with the activity that my hon. Friend describes. Of course, it is quite possible that such activity is unlawful, in which case it may well be covered in that part of the Bill. If it is not, we must ensure that there is sufficient flexibility to cover it elsewhere.

The conflict between flexibility and specificity appears elsewhere too. The Committee is right to say, as was my hon. Friend the Member for Gosport (Dame Caroline Dinenage), that the categorisation approach to differentiating platforms that present greater harm from those that present lesser harm is too blunt an instrument. We need a more sophisticated approach based on risk profile, as the Committee says—one that recognises that risks can emerge from unexpected places and with which we can see small platforms becoming influential very quickly.

I also think that the Committee is entirely right to seek to change the emphasis of the Bill away from solely content and towards activity and systems too, because ultimately it is the ordering, promoting and manipulation of content that is the root of the problem, and that is what the Bill should seek to address. Transparency will of course be crucial in enabling the regulator to do so.

It is also right to highlight, as the report does, what needs adding to or bolstering in the Bill, whether it be anonymity, the management of end-to-end encryption, misinformation and disinformation, or age assurance and verification, which others have spoken about. There are, as the Committee says, other changes needed to the Bill. The structure at the moment is heavily dependent on risk assessments that platforms themselves conduct. There is no provision at the moment for the regulator to do something about those risk assessments being profoundly inadequate, whether by accident or design, and there clearly needs to be.

We all agree that Ofcom needs adequate sanctions to be taken seriously. I welcome, as the Committee does, the Government’s indication that they will bring forward more quickly individual director liability for information offences—in other words, failures to give Ofcom information about what is going on. We need to recognise that out of that arises the potentially fairly ludicrous situation in which an individual director might be engaging in the most appalling conduct, but so long as they are honest with the regulator about it, they are okay. That cannot be right and that is why I think the Committee is right to identify the need for an additional offence to deal with egregious conduct by directors.

The balance between parliamentary oversight and the operation of ministerial discretion in the Bill is, frankly, in the wrong place. There is too little of the former and too much of the latter. Power for Ministers to amend codes of practice in order to reflect Government policy is a particularly chilling potential infringement on the independence of the regulator. That needs to be repaired.

The final point I want to make is this: when we approach a Bill not just of this complexity but with the groundbreaking nature that my hon. Friend the Member for Gosport described, we need to do so with humility. We may not get everything right first time and there is no monopoly of wisdom. There is no example for us to look to internationally and the rest of the world is looking at us to do this first. I take the point made earlier about delay, but we are still doing it first. We need to get it right and I hope the Government will approach it with an open mind.

15:51
Wera Hobhouse Portrait Wera Hobhouse (Bath) (LD)
- Parliament Live - Hansard - - - Excerpts

I, too, want to thank the Joint Committee for its very thorough report and for the recommendations it has made to the upcoming online safety legislation. As has already been mentioned, bringing our legislation on harmful online content into the 21st century is long overdue.

Today, I want to speak about a very particular but sadly prevalent harm: violence against women and girls. The year 2021 was a watershed moment for violence against women and girls, at least in terms of discussion around the extent of the problem. I thank each and every one of my constituents in Bath who wrote to me about these issues. We need those conversations about online violence against women and girls. The experiences of women and men online are often very different. Gendered harms are endemic. The Government have a responsibility to recognise and take meaningful steps in the Online Safety Bill to reduce those harms. I share the Joint Committee’s concern that any Bill aimed at improving online safety that does not require companies to act on misogynistic abuse would not be credible.

I welcome the Committee’s recommendation that cyber-flashing should be made illegal. The right hon. Member for Basingstoke (Mrs Miller) has already touched on that issue. Cyber-flashing is a particularly prevalent form of online violence against women and it disproportion- ately affects young women and girls. Some 76% of girls aged 12 to 18, and 41% of all women, have reported being sent unsolicited penis images. According to the dating app Bumble, 48% of millennial women said they had been sent an unsolicited sexual image in the last year alone. Like real life flashing, cyber-flashing can frighten. It can humiliate. It can violate boundaries. It is a form of sexual harassment from which even the physical boundaries of a home offer no respite. In the words of one woman who shared her experiences, cyber-flashing is “relentless”. It can cause many women to police their online activity to avoid receiving those types of images.

All too often the trauma that women experience is trivialised. Unlike in Scotland, there is no effective route to prosecution for cyber-flashing for those who experience it in England and Wales. I urge Ministers to use the Bill to close this loophole in the law. This issue, at its heart, is about consent, and consent is the principle on which new a cyber-flashing measure should be based. A consent-based approach focuses on the core wrong and makes it easier for recipients to report instances of cyber-flashing.

Nearly four years ago, I presented my upskirting Bill to the House. I argued then that upskirting should be an offence regardless of the motivation of the perpetrator. That is because upskirting causes significant harm to the victim regardless of the intentions behind it. The same is true of cyber-flashing. It was the approach taken in Texas where it is now illegal to send unsolicited genital images. I recommend Bumble for its work on this campaign, and I hope that Members across the House will add their support to it, so that we can replicate the approach in England and Wales that has been taken by Texas. Various Ministers have now signalled their support for criminalising cyber-flashing. That is very welcome, but the Government must act without delay.

As digital spaces become an ever greater part of our lives, we must ensure that an increasing number of people have a route to justice.

15:55
Matt Warman Portrait Matt Warman (Boston and Skegness) (Con)
- Parliament Live - Hansard - - - Excerpts

Let me start by commending the work of the Joint Committee. I do so in large part because it reflects the fact that, thankfully, there is precious little partisan politics in this area. There is huge agreement across the House.

I congratulate my hon. Friend the Minister. He is the latest custodian of what I fear will become the largest Christmas tree Bill in Parliament. We run the risk of presenting this piece of legislation as something that will fix the entire internet. When I was the Minister with responsibility for this matter, I felt two possibly conflicting things. The first is that our guiding and most important principle is that what is illegal offline should be illegal online. There are huge parts of this Bill where that need not even be a conversation. The cyber-flashing example is an interesting one, because flashing is illegal in the real world. The idea that it might not be illegal online is absurd. We should not even be having that conversation. There are many pieces of this Bill where, in fact, what is required is simply a tidying up exercise, reflecting the fact that our legislation has not kept pace with the changing nature of the digital world.

The second feeling that I had was that, in many cases, we had existing laws that did not even need as much modification as perhaps we might think. There is an important issue of the existing resources that the police allocate to online criminality. That is not to denigrate the fantastic work that the police do—of course it is not—but, too often, we will have had constituents who have gone to their local police forces and found that online crime is treated fundamentally differently. The Home Office is on board with addressing that, but it does need to change.

I am, however, fundamentally optimistic about what this Bill can achieve. We are now all agreed that it is only right that elected people regulate the public square, and the public square now firmly includes Facebook and Twitter, so it cannot be right that Mark Zuckerberg has more power than my hon. Friend the Minister, or, indeed, the Prime Minister of many countries. That cannot be right, and this Bill goes a long way to fixing that.

I want to touch on a couple of specific points. The first is that it is plainly also right that we should be regulating advertising at the same time as we are regulating other content. There has never been a doubt in the Government’s mind that that should happen, but, importantly, I know that there is some ambition to align the timetables of both of those pieces of work.

Likewise, the place of journalism online is incredibly important. We can overthink who is a journalist in the modern world, but we should be able to make some sensible progress on whether the self-regulated journalism that we have in this country adopts a different status online, and Ofcom, in its work in this Bill, should reflect that.

My final point on those specific issues is that I know that my hon. Friend the Member for Gosport (Dame Caroline Dinenage) and former Secretaries of State who spoke in the debate wanted the primacy of free speech of the individual to be protected in all the work that we have done in this Parliament. I know it is difficult to embed that in a Bill, but we must address the fact that, just as large numbers of people are—rightly or wrongly—genuinely hesitant about taking the vaccine, large numbers of people genuinely worry that the Bill will allow serious constriction of free speech online. For me, that potentially has a chilling effect, which we should all be concerned about.

I commend the Minister for how he has engaged with colleagues across the House. We need to communicate better about how the Bill will tidy up some aspects of the imperfections that I have mentioned and make a real change and difference to how people experience social media. Fundamentally, we need to be clear that the Government’s commitment to protecting free speech runs like a golden thread through the Bill and that the measure will not potentially undermine that. None the less, I think we all agree that the Bill will perform an essential function in an area that is essential to all aspects of modern life.

16:00
Stephen Timms Portrait Stephen Timms (East Ham) (Lab)
- Parliament Live - Hansard - - - Excerpts

I congratulate the hon. Member for Folkestone and Hythe (Damian Collins) and his colleagues on the Committee on their work. In particular, as he knows, I warmly welcome the recommendation that paid-for advertising should no longer be excluded from the scope of the Bill.

My hon. Friend the Member for Tooting (Dr Allin-Khan) put me in touch with one of her constituents who had some experience of online scamming. He explained that he is an experienced fund manager who used to run his own investment management firm. Last February, he clicked on a link that offered suspiciously high returns because he wanted to see what the scam was. Over the following months, despite repeatedly telling every caller that he was not interested, he was subjected to a daily barrage of calls, which petered out only in October. He said:

“When I started challenging them after a couple of months, they started becoming abusive… threatening to sue me for slander when I pointed out what they were doing was illegal…It actually became very stressful…Having warned Google many times, it fails to take action.”

We now have a well-established organised crime industry staffed by a large number of accomplished thugs, sustained by cheap and easy access to victims on Google and on Facebook. My interest arises, as the hon. Member for Folkestone and Hythe mentioned, from the Work and Pensions Committee inquiry on pension scams.

In a letter to the Work and Pensions Committee last May, the chief executive of the Financial Conduct Authority told us that

“fraud now accounts for one-in-three crimes in the UK, costing up to £190 billion a year. An estimated 86% of fraud is committed online…Action Fraud has told us that victims of pensions-related scams who had worked their whole lives to build a retirement fund had lost £82,000 on average…Online platforms, such as search engines and social media platforms, are playing an increasingly significant role in putting consumers at risk of harm by exposing them to adverts for financial products…Fraudsters have unprecedentedly cheap access to an online population of consumers who find it difficult to differentiate genuine offers from the fraudulent…There are ads online for firms that don’t exist, for firms that claim to be regulated but aren’t, for firms that claim to be based in the UK but aren’t and for clones of legitimate authorised firms.”

That is why I applaud so warmly the Joint Committee’s recommendation that the Bill should be broadened, as the Governor of the Bank of England said, to cover online fraud. I welcome the support for that move expressed in the debate.

I referred earlier to the Prime Minister’s statement to the Liaison Committee last July that

“one of the key objectives of the Online Safety Bill is to tackle online fraud.”

However, the current draft of the Bill excludes most of the online fraud problem, so I urge the Minister to tackle it head on in the Bill. The public certainly want that; Aviva has published research concluding that 87% of the public want the Government to legislate to stop search engines and social media platforms promoting financial scams through advertising.

Until now, Ministers have said that the problem will be addressed by separate work on online advertising. That really is not enough. That work is proceeding at a snail’s pace. In February 2019, the Department for Digital, Culture, Media and Sport announced that it was considering the regulation of online advertising, but three years on, there has been no progress at all. We now understand that there will be another consultation later this year. It will be years before that work delivers, and in the meantime thousands will have lost their life savings and UK financial services will have suffered further damage. The Government must not surrender to organised crime. I urge the Minister to accept the Joint Committee’s recommendation.

16:05
Suzanne Webb Portrait Suzanne Webb (Stourbridge) (Con)
- Parliament Live - Hansard - - - Excerpts

It was a privilege to serve on the Joint Committee, which was chaired admirably by my hon. Friend the Member for Folkestone and Hythe (Damian Collins). It was a cross-party, cross-House experience to which we brought a wealth of our own experience, and the report seems to have been warmly received.

There is no doubt that the Committee’s experience of listening to hours of harrowing testimony and reading the written evidence was truly humbling, but I have always questioned why we actually need the Bill—that was my constant narrative all the way through. As we trawled through the written evidence and listened to hours of harrowing verbal evidence, it was unclear to me why the tech companies did not remove harmful content at the first opportunity and monitor their systems. They should be doing that in the first instance.

Those systems cause so much pain and upset. They have led to insurrection, to prosecutions, to people being robbed of their hard-earned money and to people dying. The problem that our Committee faced is that many tech companies are now bigger than a single news agency—arguably than any Government, for that matter—and have a monopoly on people’s thoughts and beliefs, driven by algorithms that are driven by immense profit.

In the time that I have, I will focus on the governance element of our recommendations. Robust regulatory oversight will be critical to ensuring this Government’s ambition for us to be one of the safest places online in the world. To put in context why that is so important, let me explain about killer algorithms.

An algorithm is a series of instructions telling a computer how to transform a set of facts about the world into useful information. My hon. Friend the Member for Gosport (Dame Caroline Dinenage) touched on the point that an algorithm can constantly recommend pictures of dogs to dog lovers like me, but the dark side is that it can also constantly recommend to a vulnerable teenager pictures of self-harm, suicide content, violent sexual pornography or unsolicited contact with adults they do not know, right the way through to more insidious harms that might be built up over time.

We heard the sad story of the suicide of Molly Russell from her father during the evidence sessions. She was a 14-year-old who killed herself after viewing images of self-harm and suicide online. The coroner heard that in her last six months she used her Instagram account more than 120 times a day, liked more than 11,000 pieces of content and shared over 1,500 videos. An inquest is examining how algorithms contributed to her death.

During the evidence sessions, we also learned of Zach Eagling. Gorgeous 10-year-old Zach has epilepsy; I have had the privilege of meeting him. He was subject to the most deplorable and deliberate practices targeting epilepsy sufferers with flashing images.

Those were two of the stand-out moments that broke my heart during the evidence sessions. Why were the tech companies not stopping these killer algorithms? Why did they allow this to happen? In principle, tech companies self-regulate already, but they have failed. Lack of accountability, combined with commercialisation, has created a perfect storm in which social media can literally kill, so the natural conclusion is that tech companies must be held liable for systems that they have created to make money for themselves and that have had harmful outcomes for others.

Our report recommends compelling service providers to safeguard vulnerable users properly and regulate illegal content. For me, the key recommendation is

“that a senior manager at board level or reporting to the board should be…made liable for a new offence: the failure to comply with their obligations as regulated service providers when there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users.”

Let me use the case of 10-year-old Zach. This is what it would mean to him if our recommendations were accepted: sending flashing images to epilepsy sufferers would become a criminal offence.

The human cost of the internet is unquantifiable, and I applaud the Government for what will be a ground-breaking and truly world-leading Bill. Our recommendations will ensure that the Bill holds platforms to account and achieve the Government’s aim of making the United Kingdom the safest place in the world to be online. We owe that to Molly Russell and to Zach Eagling. The tech companies should be removing harmful content and enforcing safety by design at the first opportunity. Surely they do not have to wait for the Bill; they can do the right thing now.

16:10
Carla Lockhart Portrait Carla Lockhart (Upper Bann) (DUP)
- Parliament Live - Hansard - - - Excerpts

Like other hon. Members, I place on the record my sincere thanks to members and staff of the Joint Committee for their invaluable work. The Bill has huge potential for good and is so desperately needed. The Committee can take satisfaction that the recommendations in the report would most certainly assist in achieving the Government’s aim or stated objective of making the UK the safest place in the world to be online.

I will address some issues around the anonymous abuse that some describe as “legal but harmful”—the report offers some very constructive proposals to address that—as someone who has been on the receiving end of torrents of such abuse over many years, and in solidarity with a colleague and friend, Diane Dodds, who is a Member of the Northern Ireland Assembly, a former Member of the European Parliament, and the wife of the Lord Dodds of Duncairn. In his time in this place as the Member for Belfast North, Lord Dodds was a tireless campaigner for the fortification of flour with folic acid. Many will know that he did so as a father who had lost his beloved son Andrew, who was born with spina bifida and passed away aged eight.

Over the new year period, Diane shared a post on Twitter. In it, she was pictured with her two dogs, and extended best wishes to followers for the year ahead. An anonymous troll responded:

“Nice looking dogs, have they taken the place of your dead son?”

I know that you, Madam Deputy Speaker, and Members across this House, will share my revulsion that such a vile and callous remark was made to a mother who still grieves the loss of her child. Yet when the comment was reported, Twitter’s initial response was that it did not violate the rules. Only after three days of significant media attention did Twitter change its stance and suspend the account.

Will the Bill address that form of online abuse? Will it lift the cloak of anonymity and extend the veil of protection to people being attacked and abused online in such a callous, vindictive and cruel way? In that regard, I believe that the report of the Joint Committee and its recommendations make for better legislation and better protection for users online by rightly addressing issues with the lack of traceability by law enforcement, the frictionless creation and disposal of accounts at scale, the lack of user control over accounts they engage with, and the failure of platforms to deal with abuse. I endorse all the report’s recommendations in that regard.

I bring particular attention to the proposal that the higher-risk platforms such as Facebook and Twitter allow that choice of verified and non-verified accounts to be offered to users, and then subsequent options for the interaction of the two. Such an option would not only protect people like Diane; it would protect the schoolchild who is being abused because of their image. Verification is key and must be legislated for. Otherwise, the Bill is toothless.

The report also deals with the rather disturbing trend of multiple account creation and the use of such accounts to spread a message in a short time. In Northern Ireland, we see the method in action daily; it is used by foot soldiers or keyboard warriors of one political party to spread their message and falsehoods, and with the more sinister motive of silencing opponents. In our democracy, we need protection from such tactics, and I commend the Joint Committee for its proposals on that.

I wish briefly to mention the joint report and its approach to pornography. I believe that the Joint Committee should have offered a more robust approach to improving the legislation and safeguarding children in particular from the serious harm caused by pornography. First, if the report’s suggestion of replacing clause 11 of the Bill with a list of online harms is accepted by the Government, pornography should be listed as an online harm. For many of us in this House it was troubling that the Government in 2019 abandoned part 3 of the Digital Economy Act 2017, leaving children with no protection online.

Secondly, it is welcome that the Joint Committee has proposed age verification through the age-appropriate design code. Two concerns remain: whether the provision is as robust as the Digital Economy Act, and the timeframe for enacting it and the children who will consequently stumble on this type of material. In the interim, the Government must implement part 3 of the 2017 Act, and then the design code and this Bill can be used to offer greater protection.

16:15
Nickie Aiken Portrait Nickie Aiken (Cities of London and Westminster) (Con)
- Parliament Live - Hansard - - - Excerpts

This is a comprehensive, thoughtful and constructive report, and I pay tribute to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and his Committee for their tireless work.

Generally speaking, I welcome the work to tackle online abuse. In particular, I note the contribution made by my hon. Friend the Member for Stroud (Siobhan Baillie) and the vital work on preventing cyber-flashing by my hon. Friend the Member for Brecon and Radnorshire (Fay Jones). In my previous contributions on this subject, I have noted some horrific examples of antisemitic abuse, which for me underscore the importance of what this Bill will do. We cannot continue in a world where there are nearly two antisemitic tweets for every Jewish person in the UK. Measures to tackle that are central to the Bill and rightly take pride of place within it.

However, in the time I have available I will speak on an issue that has not always been front and centre of the debate: online fraud. City of London Police, based in my constituency, is the national lead for economic and cyber-crime and for fraud. Its contributions to the Committee, alongside those of the Office of the City Remembrancer, cannot be overstated. It was made clear in the Joint Committee report that fraud and cyber-crime are on an upward trajectory, affecting more people more often than any other crime today.

We know that fraudsters are increasingly sophisticated. They are always looking for the next chink in our digital armour, so I am glad that paragraph 186 of the report makes it clear that we need to act on the human consequences of online fraud—not just the financial effects, but the psychological effects. Unfortunately, I suspect that the covid pandemic may have been one of the triggers for the increase in sophisticated cases of, for instance, online romance fraud.

It is therefore right that we have moved beyond the scope of the White Paper to include fraud in the Bill. Now we must ask ourselves how we as legislators can effectively tackle online fraud in this Bill, recognising that acknowledging online fraud is a first step, but behind that acknowledgment there must be robust recommendations and proposals to ensure safety. This is not about paying lip service to stopping online fraud. The forthcoming Bill cannot stop at being simply reactive, and the Committee is right that any measures to counteract fraud must prioritise prevention.

It is not enough for providers simply to undertake a risk assessment for fraudulent content and take down that content when reported. To combat online fraud effectively, we need legislation that requires platform operators to be proactive in stopping fraudulent material in the first instance—not simply removing it when people tell them about it. I welcome the recommendations that clause 41(4) be amended to add an offence of fraud and similarly that related clauses be introduced or amended so that companies are required to address it proactively.

Alongside that, the draft Bill’s proposals explicitly exclude paid-for advertising from the scope of the legislation, which would undermine any meaningful effort to properly combat all online fraud in the Bill and potentially create a loophole for criminals to circumvent legislation. As such, I support the Committee’s recommendations in paragraphs 268 to 271 of the report, which would bring such advertising into scope. This would make sure that Ofcom is responsible for acting against service providers who consistently allow paid-for advertising that creates a risk of harm to their platforms.

I know that the Government will respond with strength on this issue, and I welcome the Minister’s meetings with me to discuss it in full. I am glad to see such widespread support in this place for the report. I am in no doubt that we need to protect our citizens against aggressive and malicious abuses of technology. It is now or never.

16:20
Richard Burgon Portrait Richard Burgon (Leeds East) (Lab)
- Parliament Live - Hansard - - - Excerpts

I congratulate the whole Committee on this incredibly thorough and useful report. I am sure that it will play a key role in improving this legislation.

I want to focus my remarks on suicide and self-harm. The reason for that is the tragic case of a young man from my constituency, Joe Nihill, who took his own life at the age of 23 after accessing suicide-related information on the internet. I have raised this with the Secretary of State before in DCMS questions, and I know that she feels very keenly the need to tackle such content online. I again pay tribute to Joe’s mother Catherine and his sister-in-law Melanie, who have been running an inspiring campaign. Their campaign was inspired by the fact that Joe, in the note that he left to his family before tragically taking his own life, asked for action on this kind of online content.

I welcome the Committee’s report, particularly its recommendation that encouraging or assisting suicide is included in the primary legislation as a priority illegal harm. That is really important. I also welcome the Committee’s recommendation, in line with the Law Commission’s recommendation, that encouraging or assisting a person to seriously self-harm should be made illegal.

The two issues where I would like the Government to improve the Bill in relation to suicide and self-harm relate to the size of the platforms covered and the age of the people protected. We do not want the smaller platforms to be let off the hook unintentionally through loopholes. I ask the Minister to be mindful of this and to ensure that even smaller platforms such as online community groups, forums or message boards, where some of the most harmful content in relation to suicide and self-harm can be found, are covered as well. It is also really important that the Government agree with the idea of ensuring that we cannot have the age of 18 as a cut-off point, because that would be to miss an opportunity with this Bill. As I said, my constituent Joe, a popular young person, was 23, so the fact that he was over 18 did not mean that he was safe from this kind of harmful online content.

It is important to reflect on the written evidence that was sent to the Committee from people who had contacted the Samaritans. We have heard this already, but it is important to reiterate it. In relation to ensuring that the smaller platforms are covered by the legislation, one piece of evidence said:

“If suicidal people can’t find what they are looking for at large sites they will just go onto the smaller sites so it doesn’t solve the problem.”

Another said:

“The people using the bigger sites will just flood the smaller sites if their content starts getting removed. The standard needs to be the same across all sites.”

Unfortunately, some of the people behind this harmful content are very ingenious when it comes to evading responsibility. Things get taken down and then put up somewhere else. We cannot allow them to carry on doing this, because people are paying the price with their mental health and with their lives. Another respondent with lived experience told the Committee:

“Harmful and accessible suicide and self harm online content can be harmful at any age. I am in my fifties and would be tempted to act on this information if I felt suicidal again.”

That was quoted by my hon. Friend the Member for Blaydon (Liz Twist), but I think it is worth reiterating.

In conclusion, I welcome this report. I know from speaking to the Secretary of State outside the Chamber after DCMS questions that she takes preventing suicide and self-harm very seriously. Will the Minister confirm that, in relation to suicide and self-harm content, all platforms and people of all ages will be in scope in the final Bill that is presented to the House? That would be a real legacy for the campaign inspired by Joe Nihill, who sadly lost his life in my constituency.

16:25
Luke Evans Portrait Dr Luke Evans (Bosworth) (Con)
- Parliament Live - Hansard - - - Excerpts

I come to this report through the prism of my work on body image. The Minister will be pleased to hear that I will not give again the speech that I delivered yesterday, when he was kind enough to join proceedings on my private Member’s Bill about digitally altered body images that should carry a logo. Although I would welcome the Government taking on that Bill, I have to play on the Government’s playing field, which has led me to assess this Bill through that prism.

I should congratulate the Government on what they are trying to achieve: a world-leading, world-beating risk assessment across the internet. To achieve that would be no mean feat. I have not heard mentioned enough the role that Ofcom will play. Having met Ofcom, I know that it would need the tools and ability to investigate and to levy very heavy fines and punishments on companies for breaching the rules. They are going to be the key to holding this all together.

Body image falls on the side of content that is legal but harmful. Clause 46(3) of the draft Bill states:

“Content is within this subsection if the provider of the service has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities”.

It repeats that in several versions. I am pleased to see that that matches up with the report, but I appreciate that there is a difference of opinion on whether clause 11 should remain. Both pick up on the fact that

“Knowingly false communications likely to cause significant physical or psychological harm to a reasonable person”

should be called out. The report goes on to state:

“As with the other safety duties, we recommend that Ofcom be required to issue a mandatory code of practice to service providers on how they should comply with this duty. In doing so they must identify features and processes that facilitate sharing and spread of material in these named areas and set out clear expectations of mitigation and management strategies”.

After reading those points, both in the Bill and the report, I think a gap has been missed. There is no problem with seeing one doctored image; it is the volume of doctored images—the repeated images of shoulders distorted, waists thinner, breasts bigger—that has an impact. That is the same with people who are looking for information on dietary requirements. My hon. Friend the Member for Gosport (Dame Caroline Dinenage), who is no longer in her place, hit the nail on the head perfectly. It is about algorithms. That is where I want the Bill to be stronger. In every meeting that I have had with TikTok, Instagram, Facebook or Snapchat—you name it—when I have asked about algorithms, they say, “We can’t tell you more about it because it’s commercially sensitive,” but they are fundamentally what is driving us down the rabbit holes that the report rightly picks up on. How will we in this House determine what things look like if we do not understand what is driving them there in the first place? The horse has literally left the stables by the time we are picking up the pieces.

I am pleased that in previous debates the Minister has said that Ofcom will be able to request this information, but I would ask that we go one step further and say that that information could be exposed to the public. Why? Because that will undermine the whole model driving these companies in their commercial activity, because it will lay it bare for us all to see. That is key to the transparency that we need. Otherwise, how do we police the volume of images that are being delivered to our young people, whether they are body images or about self-harm, race hate or race-baiting, or whatever people want to call or it or whatever their niche happens to be? As we heard in this debate, social media plays on not only people’s interests, but their insecurities. That is what we have to tighten up on. The Bill and this report, working in conjunction, can really do that. However, I urge that the volume and, most importantly, the algorithms are considered.

16:30
Kevin Hollinrake Portrait Kevin Hollinrake (Thirsk and Malton) (Con)
- Parliament Live - Hansard - - - Excerpts

It is a pleasure to be called in this important debate, Madam Deputy Speaker. I wish to talk about online fraud; in my capacity as chair of the all-party group on fair business banking and as a member of the Treasury Committee, I think that is a matter of extreme importance. I congratulate the Joint Committee on its work, and my hon. Friend the Member for Folkestone and Hythe (Damian Collins), its Chairman, has done excellent work on this, particularly in pages 58 to 60 and 75 to 79 of the report.

It is good to see that the Government are looking at online fraud within the context of this Bill, but they must look at paid-for content as well. It is crucial that we do that, and the Minister has been very good in engaging on this issue. He knows how important it is, given his background. When the FCA, the Treasury, UK Finance, the Advertising Standards Authority and the Treasury Committee are all in favour of including fraud and paid-for content within the scope of this Bill, it is incumbent on the Government to do so. Otherwise, as the Treasury Committee says, there will be large financial losses to the public. Up to 40% of all crime is now fraud and, as the report says, 85% of fraud involves the internet in some way or other, so it is crucial that we cover this in the Bill.

I am massively in favour of competition and absolutely congratulate the platforms on their market dominance, but they have taken that market share in paid-for content away from our local newspapers and other such media. It is therefore crucial that we put those platforms on a fair and level playing field with those other media. I do not think we do that, and we need to be far tougher with these platforms. Clearly, they make a huge amount of money, but in many ways they get away with murder in terms of the regulation of their content, in a way that newspapers never would have done.

In my days in business, when we were advertising in newspapers we had to prove that we were who we said we were in terms of being a business, and the newspapers would look at the content of our adverts and they made sure we verified our claims. Neither of those things happens with respect to these platforms; they simply take the money and the approach is, “Let the people who are viewing it beware.” It is simply not a fair and level playing field. Newspapers were the gatekeepers but these platforms are absolutely not.

As my hon. Friend the Member for Boston and Skegness (Matt Warman) said, what works offline should be covered online, but that is not the situation at the moment. So I agree with the report that we need the platforms to be proactive in making sure that fraudulent content is removed. It needs to be covered under clause 41(4) and “priority illegal content”, so that platforms have to be proactive in taking this stuff down. We also need to be looking at clause 39 and removing paid-for ads to make sure that platforms are also proactive in removing paid-for online fraudulent content.

There is another thing we need to make sure is covered in the Bill in its final form. Fraud is not just an offence against individuals, as companies often get defrauded by mechanisms through these platforms, and we want to make sure they are covered as well. One way of doing that and focusing the platforms’ attention on this—I am not quite clear on this and I probably need to spend a bit of time with the Minister and the Chair of the Joint Committee on it—is by looking at what redress is available to people who do lose out. I know that the Committee is recommending an external redress process to cover this, but would that cover redress for financial loss? It think it should. So if the individual or the company cannot get redress through the company that they were defrauded by—it is pretty unlikely that they would—or the bank that facilitated the transaction, the platform should cover the redress to compensate those people for the loss. That would really focus the attention of platforms on making sure that this content was removed.

I am not sure that people know that the ASA, which looks at this kind of stuff and makes sure that advertising is appropriate, has no means of sanctioning a company for making claims that are not valid and do not meet with the expectations of the consumers. There are pretty much no sanctions for defrauding a company or individual in this way, or even for misleading them into buying the wrong product or making the wrong investment, as we saw with London Capital and Finance.

This Bill is a huge opportunity, and we have to make sure it is all-encompassing and ticks many of the boxes that others have spoken about in this debate.

16:34
John Nicolson Portrait John Nicolson (Ochil and South Perthshire) (SNP)
- Parliament Live - Hansard - - - Excerpts

I also thank my friend, the hon. Member for Folkestone and Hythe (Damian Collins), for securing this important debate and for his skilled chairing of the Joint Committee. His expertise and diligence ensured a thorough pre-legislative process. I also extend my thanks to the hard-working Committee staff and to the other members of the Committee, with whom it has been a pleasure to work.

You will know, Madam Deputy Speaker, that SNP Members are not always fans of this place, but the Joint Committee was an example of cross-party co-operation aimed at delivering effective legislation. We all know that we need to find a way of keeping ourselves safe online and, in particular, of fighting disinformation, which is one of the scourges of our age.

I also thank our witnesses. I suspect some enjoyed the experience more than others. Some of those joining us from Silicon Valley looked more than a little uncomfortable at a few points, as indeed they should have given their inadequate testimony.

We all hope the online harms Bill will do what no legislation has done before by providing a proper regulatory framework for the internet. Its ambition and scope require a collaborative approach, and I welcome the UK Government’s recognition of this. For too long the social media companies have been given carte blanche to make eye-watering amounts of money while spreading hatred, disinformation and harmful content.

With 11 evidence sessions and over 200 written submissions, I am able to highlight only certain key findings from the Committee, and I do not want to repeat what others have said, unusual though that is in this House. Some findings stand out. During those sessions we heard from Rio Ferdinand about the racist abuse he receives and the devastating effect it has on his family. We heard about the misogynistic abuse and harassment that more than a third of all women receive online as a matter of routine. And we heard from Nancy Kelley of Stonewall about the abuse that LGBTQ+ people receive on social media. Stonewall highlighted the real-world consequences of this abuse, with people outed on social media losing their job or their home. Trans people have been subjected to an avalanche of online abuse, with a steep rise in offline abuse, including violent attacks, as a result.

The evidence sessions also illuminated why this abuse is so prevalent on social media. The more extreme and controversial the content, the more likely it is that people will interact with it. Algorithms push this harmful content on to people’s newsfeeds, leading to yet more people viewing it. The reason for all this is cynically simple: the more that people view this content and stay on the platform, the more advertisements that the big tech companies are able to sell, and 99% of Facebook’s income is from advertising. The problem is at the heart of the business model.

Just over a year ago, the US Capitol building was stormed by protesters who attempted to “stop the steal.” Many of those taking part had been radicalised by disinformation that had spread like wildfire online. In these islands, despite the best efforts of all our health bodies, we have seen vaccine hesitancy. Social media is a gift to anti-vaxxers. It allows them to spread their conspiracy theories and lies to a vast audience. Regulating societal harms is by no means easy, but the UK Government must listen to the experts and the Committee report and reintroduce the measures on societal harms into the Bill. Only then will we be able to start tackling disinformation adequately.

Finally, the extensive powers granted by the Bill to the Secretary of State must be addressed. London School of Economics professor Dr Damian Tambini described the powers as

“closer to authoritarian than to liberal democratic standards even with the safeguards”.

The proposed powers undermine the authority and independence of the regulator. The Government propose that the Secretary of State should be able to set strategic priorities for Ofcom and direct Ofcom to make amendments to reflect Government policy. What a disturbing mix. While the Minister is doubtless relieved that the Department is in the hands of the current calm and level-headed Secretary of State, imagine if one day a left-wing snowflake held that power—I see him shuddering from afar. Those powers should be significantly diluted or preferably removed, as the report recommends.

The Bill is ambitious and the report makes non-partisan, well considered recommendations for it. I join cross-party Members in asking the Government to make the internet safer for everyone by adopting those recommendations.

16:40
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Parliament Live - Hansard - - - Excerpts

I join the House in congratulating the hon. Member for Folkestone and Hythe (Damian Collins) on convening the debate and on all his hard and excellent work in leading the Joint Committee. This has been one of the most interesting debates in this place that I have ever participated in and, given the urgent need to improve online safety, it could not have come at a more crucial time. I am grateful to all members of the Joint Committee, who had a tough job in cleaning up this confusing and long-delayed Bill. It has been a very long time coming.

Current legislation on the online space is from the analogue age and lags far behind the digital age in which most of us now live. The Bill has the potential to be the world-leading legislation that we need it to be, sending a message to social media giants who, for too long, have got away with allowing—and in some cases even promoting—harmful content online. That cannot be allowed to continue. Most of us recognise the huge impact of the Government’s failure to regulate the online space, notably on young people, yet still, as the Joint Committee report suggests, the draft legislation is not ambitious or broad enough in scope to tackle the issues at their root.

Let me be clear that some of the trends emerging online can be extremely detrimental to both physical and mental wellbeing. We have all heard desperately tragic stories involving young people. We heard such stories today from the hon. Member for Stourbridge (Suzanne Webb) and from my hon. Friends the Members for Leeds East (Richard Burgon) and for Reading East (Matt Rodda) about young people harming themselves, taking their own lives and, in some cases, even being murdered at the hands of social media. I pay tribute to Molly Russell, Joe Nihill and Olly Stephens and to their families and friends for campaigning to make social media a much safer place so that no other young people have to go through what they did.

We all know about the other harms faced online, from the spread of fake news—including dangerous anti-vax content—to financial scams offering supposedly lucrative incentives that can be hard to decipher even for the most internet-literate of people. However, despite years of warnings from the Opposition alongside campaigning groups and charities, the Government have so far failed to take robust action. In my constituency, whenever I meet young people through a school visit or a community group, the conversations almost always centre around a common interest: social media. I know that those sentiments are not unique to my area. That is why it is so utterly wrong that tech giants have been left unaccountable for so long. Labour therefore welcomes the Joint Committee’s recommendations calling for the Government to hold online tech giants to account for the design and operation of their systems. We firmly believe that regulation should be governed through legislation and by an independent regulator instead of by a distant body in Silicon Valley.

In recent weeks, we have been reminded once again of the real power and influence of material shared online in generating and spreading fake news. In the pandemic, tackling dangerous anti-vax content is critical to vaccinating the unvaccinated. With the majority of people requiring serious care in hospital for coronavirus being unvaccinated, Government inaction and complacency in tackling dangerous anti-vax sentiment is costing lives and putting pressure on the NHS,

Labour have repeatedly called on the Government to work cross-party to introduce emergency legislation that includes financial and criminal penalties for companies who fail to act to stamp out dangerous anti-vax content, yet once again they have failed to act. They must stand up to big tech companies. As my hon. Friend the Member for Newcastle upon Tyne Central (Chi Onwurah) said, we must ignore those companies’ excuses and introduce financial and criminal penalties for failures that lead to serious harm. That is echoed in the Joint Committee’s report, which recommends that responsibilities of individuals at the very top of online tech organisations go further, with full accountability for the messages that those companies are hosting and, at times, even promoting.

In our dialogue about the responsibilities of tech firms, we must remember that we need to consider the role of so-called niche organisations, too. In line with that, Labour commends the Committee’s recognition of concerns raised by Hope not Hate and the Antisemitism Policy Trust, among others, about the harms caused by these alternative platforms. Our party leader raised concerns about one such example—Telegram—during Prime Minister’s questions, and there are numerous other platforms on which misogyny, racism and homophobia run rampant, including BitChute, Gab, BrandNewTube and 4chan, to name just a few. It is absolutely right that the Government look again at categorisation so that harm caused on and by such platforms is assessed by risk and not the current determinants of size and functionality.

The Committee has also rightly noted that, while search does not operate in the same way as user-to-user platforms do, harm can still be caused through algorithmic programming and auto-prompts. We therefore urge the Government to include search engines and search services within the regulatory scope of the Bill, recognising that they, too, have a role to play in addressing not just illegal but legal and harmful content, too.

This brings me to another excellent recommendation raised by the Joint Committee. Notably excluded from the draft legislation is the ability to regulate and hold social media giants accountable for paid-for advertising hosted on their websites. We have heard from a host of Members from across the House about how important it is that that should be included in the legislation. The Committee concluded that

“The exclusion of paid-for advertising from the scope of the Online Safety Bill would obstruct the Government’s stated aim of tackling online fraud and activity that creates a risk of harm more generally.”

The Government have repeatedly claimed that regulating paid-for advertisements is beyond the scope of this legislation and that instead it will be the role of the online advertising programme to manage how adverts are monitored. But we are now almost three years down the line since the OAP was first mentioned, and we still have little more than a press release and an outdated call for evidence to confirm exactly how the programme will function.

As right hon. and hon. Members, including my right hon. Friend the Member for East Ham (Stephen Timms), have said, the Government must adopt the Joint Committee’s recommendation and expand the Bill to include paid-for adverts, which are central to so many instances of fraud and harm online more generally. Of course, crucial to this debate is therefore the need to define exactly what constitutes harm. As the Joint Committee recommends, the Government must publish their definition as soon as possible. The concept of harm underpins the entire evolution of how this legislation will be drafted and eventually enacted. It is vital that the Government’s definition is published before the Bill is introduced to ensure that Ofcom, as the regulator, is fully prepared and resourced for its role. I hope that the Minister will be able to give the House an update on that point in his comments.

I move on to address some of the detail in the Committee’s recommendations. I am pleased that the Committee has addressed many of the concerns raised about the complexity of the Bill in its current form, and Labour supports the move to bring our focus firmly back to the regulation of social media giants’ systems and progress. The Joint Committee’s report rightly reflects the strong concerns about the scale of the Secretary of State’s powers in the Bill, and we have heard from other Members about concerns regarding the scope of the regulator’s independence in many areas. We also know very little about the disinformation and misinformation unit, and that is required—Madam Deputy Speaker, I could go on. We know that this legislation is vital.

To conclude, I believe that, without the big changes recommended by the Joint Committee alongside a faster-paced and increased understanding of the wider issues, more people will find themselves at risk of harms online. The danger is that, even with the excellent recommendations of the Joint Committee, the Online Safety Bill will be inadequate and simply out of date when it eventually becomes law. The Government have a once-in-a-generation opportunity to change that, and I urge the Minister to take seriously the concerns raised by Members in the House.

16:48
Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Parliament Live - Hansard - - - Excerpts

I congratulate my hon. Friend the Member for Folkestone and Hythe (Damian Collins) on securing today’s debate and chairing the Joint Committee with such aplomb and expertise. I thank Members from all parties on the Committee—from not just this House, but the other place—for their incredibly hard work. I put on the record my thanks to them; as my hon. Friend said, Baroness Kidron and Lord Gilbert are with us today. I thank them all for their extremely thorough and detailed work. We have been studying their report—all 191 pages—very carefully, and it will definitely have an impact on the legislation as it is updated.

I also thank the Select Committee on Digital, Culture, Media and Sport and its Chair, my hon. Friend the Member for Solihull (Julian Knight), for its work. I look forward very much to its report, which my hon. Friend said would be published imminently. I encourage the Committee to ensure that it is indeed published as soon as possible, so that we can take account of its recommendations as well. I can confirm that we will be making changes to the Bill in the light of the recommendations of the Joint Committee report and those of the anticipated report from the Select Committee. We understand that there are a number of respects in which the Bill can be improved substantially. The Government certainly have no monopoly on wisdom, and we intend to profit from the huge experience of the members of the Committees, and Members of the House, in making improvements—significant improvements —to the Bill. We intend to produce a revised and updated Bill before the end of the current Session.

We intend this Bill to be a world-leading piece of legislation. We believe that the United Kingdom has an opportunity to set a global example which other countries will follow. As the hon. Member for Pontypridd (Alex Davies-Jones) said, the Bill has been some time in gestation, but because this is such a complicated topic, it is important that we get the legislation right.

This is, I think, a good moment to thank previous Secretaries of State and Ministers for the work that they did in laying the foundations on which we are now building—in fact, in building the walls as well; we are just putting the roof on. In particular, I know of the work done in this area by my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright) and my right hon. Friend the Member for Basingstoke (Mrs Miller), and also the work done by my hon. Friends the Members for Gosport (Dame Caroline Dinenage) and for Boston and Skegness (Matt Warman). I am sure that the whole House will want to thank them for the fantastic work that they did in taking us to the point where we now stand.

I entirely agree with the sentiments expressed by the Chairman of the Joint Committee, my hon. Friend the Member for Folkestone and Hythe, who said in his opening speech that social media firms had brought this legislation on themselves by the irresponsibility that they have often shown by placing profit ahead of humanity. That was powerfully illustrated by the evidence presented to the Joint Committee, and separately to the United States Senate and The Wall Street Journal, by the Facebook whistleblower Frances Haugen, who explained how Facebook’s use of algorithms—mentioned by Members, including my hon. Friends the Members for Gosport and for Bosworth (Dr Evans)—prioritised profit by promoting content that was harmful or incendiary simply because it made money, with scant, if any, regard to the harm being caused. Our view is that such an attitude is not only inappropriate but wrong.

Two or three Members have referred to the tragic suicide of 14-year-old Molly Russell, which followed a huge amount of very troubling suicide-related content being served up to her by Instagram. That sort of thing simply should not be happening. There are all too many other examples of social media firms not promptly handing over identification information to the police—I encountered a constituency case of that kind a couple of years ago—and not taking down content that is illegal, or content that clearly contravenes their terms and conditions.

This state of affairs cannot persist, and it is right for the House to act. I am heartened to note that, broadly speaking, we will be acting on a cross-party basis, because I think that that will make the message we send the world and the action we are taking all the more powerful. However, as Members have said today, even before the Act is passed, social media firms can act. They can edit their algorithms tomorrow, and I urge them to do exactly that. They should not be waiting for us to legislate; they should do the right thing today. We will be watching very closely: the House will be watching, and the public will be watching.

Kevin Hollinrake Portrait Kevin Hollinrake
- Hansard - - - Excerpts

Will my hon. Friend give way?

Kevin Hollinrake Portrait Kevin Hollinrake
- Hansard - - - Excerpts

I will be very brief. My hon. Friend has talked about cross-party working, and there is clearly cross-party consensus that paid-for advertising should be included in the scope of the Bill. Is that something that he intends to do?

Chris Philp Portrait Chris Philp
- Parliament Live - Hansard - - - Excerpts

My hon. Friend anticipates my next point. I was about to come on to some of the specifics—very quickly, because time is short.

I am not going to be pre-announcing any firm commitments today because work is still ongoing, including the collective agreement process in Government, but on fraud and paid-for advertising, we have heard the message of the Joint Committee, the Financial Conduct Authority, the financial services sector, campaigners and Members of this House such as my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake). The right hon. Member for East Ham (Stephen Timms) raised this, as did the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Cities of London and Westminster (Nickie Aiken). I was at Revolut’s head office in Canary Wharf earlier today and it raised the issue as well. It is a message that the Government have absolutely heard, and it is something that we very much hope we will be able to address when we bring the Bill forward.

I cannot make any specific commitments because the work is still ongoing, but that message is loudly heard, as is the message communicated by the right hon. Member for Barking, my right hon. Friend the Member for Basingstoke and the hon. Member for Bath (Wera Hobhouse) on the work by the Law Commission on the communications offences, which will really tighten up some of the issues to do with what are essentially malicious or harmful communications, issues such as cyber-flashing and issues to do with epilepsy that we have heard about this afternoon. We are studying those Law Commission proposals very positively and carefully, as the Joint Committee recommended that we do.

We have also heard clearly the messages concerning commercial pornography. We understand the issues presented by the fact that the Bill, as drafted, does not cover that. Again, that is something we are currently working on very hard indeed.

Anonymity is another important issue raised today by my right hon. Friend the Member for Basingstoke and the hon. Member for Upper Bann (Carla Lockhart), among others. They and the Joint Committee have suggested that users should be given the option to protect themselves from anonymous content. They also addressed the critical question of traceability when law enforcement needs to investigate something. Again, those messages have been heard very clearly and we are working very hard on those.

That brings me to the tragic case raised by the hon. Member for Reading East (Matt Rodda) of his constituent Olly, who was so appallingly murdered; the murder appears to have been organised online. Under the Bill as drafted, organising an act like that—an illegal act—will be dealt with. I have just mentioned the point about traceability, which we are studying very carefully. The hon. Member said he had some concerns that the social media companies concerned did not provide the police with the identification information required when requested. I had a similar case a couple of years ago with Snapchat. If he could look into the details of that and come back to me with the specifics, I would be very interested to hear those because that would give us additional evidence if further steps need to be taken via the amended Bill. If he could come back to me on that, I would be very grateful.

A number of Members have rightly raised the point about transparency and understanding exactly what these social media firms are doing. The right hon. Member for Barking made that point powerfully, as did the hon. Member for Newcastle upon Tyne Central (Chi Onwurah). Of course, the Bill does give Ofcom extremely wide-ranging powers to require information to be delivered up. It also imposes transparency obligations upon these companies. There are criminal sanctions on individuals if those provisions are broken, and we have heard clearly the suggestion that those be brought forward and commenced much earlier. The Bill will also contain strong protections for free speech. I have not got time to talk about that more, but protecting free speech clearly is very important.

The country demands action, this House demands action and we will take it.

15:24
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I thank all Members who contributed to what has been an excellent debate. We have heard from Members from each nation of the United Kingdom and almost every political party represented in the House as well, all of whom were supporting the principle of the Bill and supporting a majority of the recommendations in the report. I think we all share an urgency that we want to get this done.

Members spoke not just out of an appreciation of the policy issues, but from personal experience. The right hon. Member for Barking (Dame Margaret Hodge) talked about the abuse that she has received, as so many other Members of the House have. The hon. Member for Reading East (Matt Rodda) raised a case on behalf of his constituents and my hon. Friend the Member for Bosworth (Dr Evans) did so with regards to his campaign on body image. We know ourselves, from our personal experience and the experience of our constituents, why it is necessary for legislation on this. There is also a question about how the House scrutinises the powers we will give Ofcom and how the regime will work in the future.

Question put and agreed to.

Resolved,

That this House has considered the Report of the Joint Committee on the draft Online Safety Bill, HC 609.