All 2 Debates between Damian Collins and Charlotte Nichols


ONLINE SAFETY BILL (Second sitting)

Debate between Damian Collins and Charlotte Nichols
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Online Safety Act 2023. Amendment Paper: Public Bill Committee Amendments as at 13 December 2022.
Charlotte Nichols

As much as I am keen on the idea of Ofcom special agents conceptually, my concern on the transparency front is that, to appoint a special agent and send them in to look at the data, Ofcom would have to have cause to believe that there was an issue of concern with the data, whereas if that data were more transparently available to the research community, researchers could proactively identify things to flag to Ofcom as a concern. Without that, we are relying on an annual cycle in which Ofcom can intervene only when it has a concern, rather than the research community, which is much better placed to make that determination, being able to keep a watching brief on the company.

Damian Collins

That concern could be triggered by Ofcom discovering things as a consequence of user complaints. Although Ofcom is not a complaint resolution body, users can complain to it. Independent academics and researchers may produce studies and reports highlighting problems at any time, so Ofcom does not have to wait through an annual cycle of transparency reporting. At any time, Ofcom can say, “We want to have a deeper look at this problem.” It could be something Ofcom or someone else has discovered, and Ofcom can either research it itself or appoint an outside expert.

As the hon. Member for Warrington North mentioned, very sensitive information might become apparent through the transparency reporting that one might not necessarily wish to make public because it requires further investigation and could highlight a particular flaw that could be exploited by bad actors. I would hope and expect, as I think we all would, that we would have the routine publication of transparency reporting to give people assurance that the platforms are meeting their obligations. Indeed, if Ofcom were to intervene against a platform, it would probably use information gathered and received to provide the rationale for why a fine has been issued or another intervention has been made. I am sure that Ofcom will draw all the time on information gathered through transparency reporting and, where relevant, share it.

ONLINE SAFETY BILL (First sitting)

Debate between Damian Collins and Charlotte Nichols
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Online Safety Act 2023. Amendment Paper: Public Bill Committee Amendments as at 13 December 2022.
Charlotte Nichols

I could not agree more. I suppose that is why this aspect of the Bill is so important, not just to me but to all those categories of user. I mentioned paragraphs (d) to (f), which would require platforms to assess exactly that risk. This is not about being offended. Personally, I have the skin of a rhino. People can say most things to me and I am not particularly bothered by it. My concern is where things that are said online are transposed into real-life harms. I will use myself as an example. Online, we can see antisemitic and conspiratorial content, covid misinformation, and covid misinformation that intersects with antisemitism and conspiracy theories. When people decide that I, as a Jewish Member of Parliament, am personally responsible for George Soros putting a 5G chip in their arm, or whatever other nonsense they have become persuaded by on the internet, that is exactly the kind of thing that has led to people coming to my office armed with a knife. The kind of content that they were radicalised by on the internet led to their perpetrating a real-life, in-person harm. Thank God—Baruch Hashem—neither I nor my staff were in the office that day, but that could have ended very differently, because of the sorts of content that the Bill is meant to protect online users from.

Damian Collins

The hon. Lady is talking about an incredibly important issue, but the Bill covers such matters as credible threats to life, incitement to violence against an individual, and harassment and stalking: those patterns of behaviour. Those are public order offences, and they are in the Bill. I would absolutely expect companies to risk-assess for that sort of activity, and to be required by Ofcom to mitigate it. On her point about Holocaust denial, first, the shield will mean that people can protect themselves from seeing such content. The further question would be whether we create new offences in law, which could then be transposed across.

Charlotte Nichols

I accept the points that the hon. Member raised, but he is fundamentally missing the point. The categories of information and content that these people had seen and been radicalised by would not fall under the scope of public order offences or harassment. The person was not sending me harassing messages before they turned up at my office. Essentially, social media companies and other online platforms have to take measures to mitigate the risk of categories of offences that are illegal, whether or not they are in the Bill. I am talking about what clauses 12 and 13 covered, whether we call it the “legal but harmful” category or “lawful but awful”. Whatever we name those provisions, by taking out of the Bill clauses relating to the “legal but harmful” category, we are opening up an area of harm that already exists, that has a real-world impact, and that the Bill was meant to go some way towards addressing.

The provisions that have been removed took out the risk assessments that needed to be done. The Bill says,

“(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults”.

Again, the idea that we are talking about offence, and that the clauses need to be taken out to protect free speech, is fundamentally nonsense.

I have already mentioned Holocaust denial, but health-related disinformation is also worth mentioning. We have already seen real-world harms from some of the covid misinformation online. It led to people, including Piers Corbyn, turning up outside Parliament with a gallows and threatening to hang hon. Members for treason. Obviously, that was rightly dealt with by the police, but the kind of information and misinformation that he had encountered online and that led him to do that, which is legal but harmful, will now not be covered by the Bill.

I will also raise an issue I have heard about from a number of people dealing with cancer and conditions such as multiple sclerosis. People online try to discourage them from accessing the proper medical interventions for their illnesses, and instead encourage them to take more vitamin B or adopt a vegan diet. There are people who have died because they had cancer but were encouraged online not to access cancer treatment, because they were subject to lawful but awful categories of harm.