Draft Online Safety Bill Report Debate
Damian Collins (Conservative - Folkestone and Hythe)
Commons Chamber
I beg to move,
That this House has considered the Report of the Joint Committee on the draft Online Safety Bill, HC 609.
I would like to start by thanking the members and Clerks of our Joint Committee, who put in a tremendous effort to deliver its report. In 11 sitting weeks, we received more than 200 submissions of written evidence, took oral evidence from 50 witnesses and held four further roundtable meetings with outside experts, as well as Members of both Houses. I am delighted to see my Joint Committee colleagues Lord Gilbert and Baroness Kidron in the Gallery. I thank the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Croydon South (Chris Philp), and the Secretary of State for the open and collaborative way in which they worked with the Committee throughout the process and our deliberations. I also thank Ofcom, which provided a lot of constructive guidance and advice to the Committee as we prepared the report.
This feels like a moment that has been a long time coming. There has been huge interest on both sides of the House in the Online Safety Bill ever since the publication of the first White Paper in April 2019, which was followed by two Government responses, the publication of the draft Bill and a process of pre-legislative scrutiny by the Joint Committee. I feel that the process has been worthwhile: in producing a unanimous report, the Committee has reflected the wide range of opinions that we received and put forward some strong ideas that will improve the Bill, which I hope will get a Second Reading later in the Session. It has been a process worth undertaking, and many other Lords and Commons Committees have at the same time been looking at the important issues around online safety and the central role that online services play in our lives.
The big tech companies have had plenty of notice that this is coming. During that period, have we seen a marked improvement? Have we seen the introduction of effective self-regulation? Have the companies set a challenge to Parliament, saying “You don’t really need to pass this legislation, because we are doing all we can already”? No. If anything, the problems have got worse. Last year, we saw an armed insurrection in Washington DC in which a mob stormed the Capitol building, fuelled by messages of hate and confrontation that circulated substantially online. Last summer, members of the England football team were subjected to vile racist abuse at the end of the final; the football authorities had warned the companies that this could happen, but the companies did not prepare for it or act adequately at the time.
As Rio Ferdinand said in evidence to the Joint Committee, people should not have to put up with this. People cannot just put their device down—it is a tool that they use for work and to stay in communication with their family and friends—so they cannot walk away from the abuse. If someone is abused in a room, they can leave the room, but they cannot walk away from a device that may be the first thing that they see in the morning and one of the last things that they see at night.
We have seen an increase in the incidence of child abuse online. The Internet Watch Foundation has produced a report today that shows that yet again there are record levels of abusive material related to children, posing a real child safety risk. It said the same in its report last year, and the issues are getting worse. Throughout the pandemic, we have seen the rise of anti-vaccine conspiracies.
I commend the hon. Gentleman for bringing this forward. We have a colleague in Northern Ireland, Diane Dodds MLA, who has received unbelievably vile abuse directed at her and her family. Does the hon. Gentleman agree that there is a huge loophole and gap in this Bill—namely, that anonymity remains available, allowing comments such as those directed at my colleague and friend Diane Dodds, which were despicable in the extreme? There will be no redress and no one held accountable through this Bill. The veil of anonymity must be lifted and people made to face the consequences of what they are brave enough to type but not to say.
Order. The hon. Gentleman is not trying to make a speech, is he? No, he is not.
The hon. Gentleman raises an important issue. The Committee agreed in the report that there must be an expedited process of transparency, so that when people are using anonymity to abuse other people—saying things for which in public they might be sued or have action taken against them—it must be much easier to swiftly identify who those people are. People must know that if they post hate online directed at other people and commit an offence in doing so, their anonymity will not be a shield that will protect them: they will be identified readily and action taken against them. Of course there are cases where anonymity may be required, when people are speaking out against an oppressive regime or victims of abuse are telling their story, but it should not be used as a shield to abuse others. We set that out in the report and the hon. Gentleman is right that the Bill needs to move on it.
We are not just asking the companies to moderate content; we are asking them to moderate their systems as well. Their systems play an active role in directing people towards hate and abuse. A study commissioned by Facebook showed that over 60% of people who joined groups that showed extremist content did so at the active recommendation of the platform itself. In her evidence to the Committee, Facebook whistleblower Frances Haugen made clear the active role of systems in promoting and driving content through to people, making them the target of abuse, and making vulnerable people more likely to be confronted with and directed towards content that will exacerbate their vulnerabilities.
Facebook and companies like it may not have invented hate, but they are driving hate and making it worse. They must be responsible for their systems. It is right that the Bill will allow the regulator to hold those companies to account not just for what they do or do not take down, but for the way they use the systems they have created and designed to make money for themselves by keeping people on them longer. The key thing at the heart of the Bill, and at the heart of the report published by the Joint Committee, is that the companies must be held liable for the systems they have created. The Committee recommended a structural change to the Bill to make it absolutely clear that what is illegal offline should be regulated online. Existing offences in law should be written into the Bill, and it should be demonstrated how the regulator will set the thresholds for enforcement of those measures online.
This approach has been made possible by the work of the Law Commission in producing its recommendations, particularly in introducing new offences around actively promoting self-harm and promoting content and information that is known to be false. A new measure will give us the mechanism to deal with malicious deepfake films being targeted at people. There are also necessary measures to ensure that both the regulator and the companies work to guiding principles that have regard to public health in dealing with dangerous disinformation relating to the pandemic or other public health issues.
We also have to place an obligation on the regulator to uphold principles of freedom of expression. It is important that effective action should be taken against hate speech, extremism, illegal content and all harmful content within the scope of the Bill, but if companies are removing content that has every right to be there, where the legitimate expression of people’s opinions has every right to be online, then the regulator should have the power to intervene in that direction as well.
At the heart of the regime there has to be a system in which Ofcom, as the independent regulator, can set mandatory codes and standards that we expect the companies to meet, and then use its powers to investigate and audit them to make sure that they are complying. We cannot have a system based on self-declared transparency reports from the companies, where even they struggle to explain what the results mean and there is no mechanism for knowing whether they are giving us the full picture or only a highly partial one. The regulator must have that power. Crucially, the codes of practice should set the mandatory minimum standards. We should not have Silicon Valley deciding what the online safety of citizens in this country should be. That should be determined through legislation passed by this Parliament, empowering the regulator to set the minimum standards and take enforcement action when they have not been met.
We also believe that the Bill would be improved by removing a controversial area: the principles in clause 11, under which the priority areas of harm are determined by the Secretary of State and are merely advisory to the companies. If we base the regulatory regime and the codes of practice on established offences that this Parliament has already created, which are known, understood and enforceable, we can say that they are mandatory and clear and that there has been a parliamentary approval process in creating the offences in the first place.
Where new areas of harm are added to the schedules and the codes of practice, there should be an affirmative procedure in both Houses of Parliament to approve those changes to the code, so that Members have the chance to vote on changes to the codes of practice and the introduction of new offences as a consequence of those offences being created.
The Committee took a lot of evidence on the question of online fraud and scams. We received evidence from the Work and Pensions Committee and the Treasury Committee advising us that this should be addressed: if a known scam or attempt to rip off and defraud people is present on a website or social media platform, be it through advertising or any other kind of posting, it should be within the scope of the Bill, and it should be for the regulator to require its removal. There should not be a general blanket exemption for advertising, which would create a perverse incentive to promote such content more actively.
I thank my hon. Friend for his work on this important issue. Does he agree, as the report sets out, that platforms must be required to seek out that content proactively and ensure that it is changed or, failing that, removed, rather than all removals being prompted by users?
It is vital that companies are made to act proactively. That is one of the problems with the current regime, where action against illegal content is only required once it is reported to the companies and they are not proactively identifying it. My hon. Friend is right about that, particularly with frauds and scams where the perpetrators are known. The role of the regulator is to ensure that companies do not run those ads. The advertising authorities can still take action against individual advertisers, as can the police, but there should be a proactive responsibility on the platforms themselves.
If you will allow me to say one or two more things, Madam Deputy Speaker, we believe it is important that there should be user redress through the system. That is why the Committee recommended creating an ombudsman to whom users can turn when complaints processes have been exhausted without successful resolution, but also permitting civil redress through the courts.
If an individual or their family has been greatly harmed as a consequence of what they have seen on social media, they may take some solace in the fact that the regulator has intervened against the company for its failures and levied fines or taken action against individual directors. However, as an individual can take a case to the courts for a company’s failure to meet its obligations under data protection law, that should also apply to online safety legislation. An individual should have the right, on their own or with others, to sue a company for failing to meet its obligations under an online safety Act.
I commend the report to the House and thank everyone involved in its production for their hard work. This is a Bill we desperately need, and I look forward to seeing it pass through the House in this Session.
I thank all Members who contributed to what has been an excellent debate. We have heard from Members from each nation of the United Kingdom and from almost every political party represented in the House, all of whom supported the principle of the Bill and the majority of the recommendations in the report. I think we all share a sense of urgency: we want to get this done.
Members spoke not just from an appreciation of the policy issues, but from personal experience. The right hon. Member for Barking (Dame Margaret Hodge) talked about the abuse that she has received, as have so many other Members of the House. The hon. Member for Reading East (Matt Rodda) raised a case on behalf of his constituents, and my hon. Friend the Member for Bosworth (Dr Evans) did so with regard to his campaign on body image. We know from our personal experience and the experience of our constituents why legislation on this is necessary. There is also a question about how the House scrutinises the powers we will give Ofcom and how the regime will work in the future.
Question put and agreed to.
Resolved,
That this House has considered the Report of the Joint Committee on the draft Online Safety Bill, HC 609.