All 3 Debates between Baroness Laing of Elderslie and Damian Collins


BBC Licence Fee Non-Payment (Decriminalisation for over 75s) Bill

Debate between Baroness Laing of Elderslie and Damian Collins
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Damian Collins):

In the short time I have, I will address the concessionary licence fee for the over-75s and provide the necessary context for a range of relevant issues, including the BBC’s decision to end free TV licences for the over-75s, the Government’s work on decriminalising TV licence fee evasion and our broader road map for BBC reform, including our intention to review the licence fee funding model.

The House will no doubt be aware that, in the 2015 funding settlement, the Government agreed with the BBC that the responsibility for the over-75s concession should transfer to the BBC. The Government and the BBC agreed to make that change. Alongside that, the Government also closed the iPlayer loophole, committed to increase the licence fee in line with inflation and reduced a number of other spending commitments. To help with the financial planning, the Government agreed to provide phased transitional funding over two years to gradually—

Draft Online Safety Bill Report

Debate between Baroness Laing of Elderslie and Damian Collins
Thursday 13th January 2022


Commons Chamber
Madam Deputy Speaker (Dame Eleanor Laing):

Order. The hon. Gentleman is not trying to make a speech, is he? No, he is not.

Damian Collins:

The hon. Gentleman raises an important issue. The Committee agreed in the report that there must be an expedited process of transparency, so that when people are using anonymity to abuse other people—saying things for which in public they might be sued or have action taken against them—it must be much easier to swiftly identify who those people are. People must know that if they post hate online directed at other people and commit an offence in doing so, their anonymity will not be a shield that will protect them: they will be identified readily and action taken against them. Of course there are cases where anonymity may be required, when people are speaking out against an oppressive regime or victims of abuse are telling their story, but it should not be used as a shield to abuse others. We set that out in the report and the hon. Gentleman is right that the Bill needs to move on it.

We are not just asking the companies to moderate content; we are asking them to moderate their systems as well. Their systems play an active role in directing people towards hate and abuse. A study commissioned by Facebook showed that over 60% of people who joined groups featuring extremist content did so at the active recommendation of the platform itself. In her evidence to the Committee, Facebook whistleblower Frances Haugen made clear the active role of systems in promoting and driving content through to people, making them the target of abuse, and making vulnerable people more likely to be confronted with and directed towards content that will exacerbate their vulnerabilities.

Facebook and companies like it may not have invented hate, but they are driving hate and making it worse, and they must be responsible for those systems. It is right that the Bill will allow the regulator to hold those companies to account not just for what they do or do not take down, but for the way they use the systems that they have created and designed to make money for themselves by keeping people on them longer. The key thing at the heart of the Bill, and at the heart of the report published by the Joint Committee, is that the companies must be held liable for the systems they have created. The Committee recommended a structural change to the Bill to make it absolutely clear that what is illegal offline should be regulated online. Existing offences in law should be written into the Bill, and it should be demonstrated how the regulator will set the thresholds for enforcement of those measures online.

This approach has been made possible by the work of the Law Commission in producing its recommendations, particularly in introducing new offences around actively promoting self-harm and promoting content and information that is known to be false. A new measure will give us the mechanism to deal with malicious deepfake films being targeted at people. There are also necessary measures to ensure that there are guiding principles that both the regulator and the companies have to work to, having regard to public health in dealing with dangerous disinformation relating to the pandemic or other public health issues.

We also have to ensure an obligation for the regulator to uphold principles of freedom of expression. It is important that effective action should be taken against hate speech, extremism, illegal content and all harmful content that is within the scope of the Bill, but if companies are removing content that has every right to be there—where the positive expression of people’s opinions has every right to be online—then the regulator should have the power to intervene in that direction as well.

At the heart of the regime has to be a system where Ofcom, as the independent regulator, can set mandatory codes and standards that we expect the companies to meet, and then use its powers to investigate and audit them to make sure that they are complying. We cannot have a system that is based on self-declared transparency reports by the companies where even they themselves struggle to explain what the results mean and there is no mechanism for understanding whether they are giving us the full picture or only a highly partial one. The regulator must have that power. Crucially, the codes of practice should set the mandatory minimum standards. We should not have Silicon Valley deciding what the online safety of citizens in this country should be. That should be determined through legislation passed through this Parliament empowering the regulator to set the minimum standards and take enforcement action when they have not been met.

We also believe that the Bill would be improved by removing a controversial area: the principles in clause 11, under which the priority areas of harm are determined by the Secretary of State and are advisory to the companies. If we base the regulatory regime and the codes of practice on established offences that this Parliament has already created, which are known and understood and therefore enforced, we can say that they are mandatory and clear, and that there has been a parliamentary approval process in creating the offences in the first place.

Where new areas of harm are added to the schedules and the codes of practice, there should be an affirmative procedure in both Houses of Parliament to approve those changes to the code, so that Members have the chance to vote on changes to the codes of practice and the introduction of new offences as a consequence of those offences being created.

The Committee took a lot of evidence on the question of online fraud and scams. We received evidence from the Work and Pensions Committee and the Treasury Committee advising us that this should be done: if a known scam or attempt to rip off and defraud people is present on a website or social media platform, be it through advertising or any kind of posting, it should be within the scope of the Bill, and it should be for the regulator to require its removal. There should not be a general blanket exemption for advertising, which would create a perverse incentive to promote such content more actively.

Digital Economy Bill

Debate between Baroness Laing of Elderslie and Damian Collins
Report stage, Legislative Grand Committee, Programme motion No. 3 & 3rd reading: House of Commons
Monday 28th November 2016


Commons Chamber
Madam Deputy Speaker (Mrs Eleanor Laing):

Order. We have one hour and one minute left in this debate and many Members want to speak—and I suspect they will also wish to have answers from the Minister and would not like to truncate his contribution to the debate. I cannot impose a time limit; I can only ask for courtesy from one Member to another and short speeches. I am not suggesting speeches so far have been too long, but I ask Members to speak as quickly as they possibly can.

Damian Collins:

I will try to adhere to your guidelines, Madam Deputy Speaker.

I would like to speak to new clause 31, but first I want to congratulate the hon. Member for Washington and Sunderland West (Mrs Hodgson) on her campaigning over many years to deal with the abuses in the secondary ticketing market. I also want to congratulate my Select Committee colleague, my hon. Friend the Member for Selby and Ainsty (Nigel Adams), who took up this issue strongly in the Bill Committee. In fact, the new clause that we are discussing tonight is exactly the same as the one he tabled for discussion in Committee. Such was the power of his argument that he persuaded the hon. Member for Cardiff West (Kevin Brennan) to pursue this matter on Report, and I am grateful to the shadow Minister for agreeing that the Select Committee could table this new clause for discussion on Report.