All 3 Debates between Dean Russell and Kim Leadbeater

Online Safety Bill (Fourteenth sitting)

Debate between Dean Russell and Kim Leadbeater
Committee stage
Tuesday 21st June 2022

Public Bill Committees
Kim Leadbeater

I move the amendment in my name and will speak to amendment 113, which is in the name of the hon. Member for Blackpool North and Cleveleys (Paul Maynard).

The amendment would put into effect Zach’s law in full. Zach, as many Members know, is an amazing, energetic and bright young boy from my constituency. I had the absolute pleasure of visiting Zach and his mum Clare at their home in Hartshead a few weeks ago. We chatted about school and his forthcoming holiday, and he even invited me to the pub. However, Zach also has epilepsy.

Disgustingly, he was trolled online a few years ago and sent flashing images by bullies, designed to trigger his condition and give him an epileptic seizure, a seizure that not only would cause him and his family great distress, but can be extremely dangerous and cause Zach significant psychological and physical harm. I know that we are all united in our disgust at such despicable actions and committed to ensuring that this type of unbelievable online bullying is against the law under the Bill.

On Second Reading, I raised the matter directly with the Minister and I am glad that he pointed to clause 150 and stated very explicitly that subsection (4) will cover the type of online harm that Zach has encountered. However, we need more than just a commitment at the Dispatch Box by the Minister, or verbal reassurances, to protect Zach and the 600,000 other people in the UK with epilepsy.

The form of online harm that Zach and others with epilepsy have suffered causes more than just “serious distress”. Members know that the Bill as drafted lists

“psychological harm amounting to at least serious distress”

as a qualifying criterion of the offence. However, I believe that does not accurately and fully reflect the harm that epilepsy trolling causes, and that it leaves a significant loophole that none of us here wish to see exploited

For many people with epilepsy, the harm caused by this vicious online trolling is not only psychological but physical too. Seizures are not benign events. They can result in broken bones, concussion, bruises and cuts, and in extreme cases can be fatal. It is simply not right to argue that physical harm is intrinsically intertwined with psychological harm. They are different harms with different symptoms. While victims may experience both, that is not always the case.

Professor Sander, medical director of the Epilepsy Society and professor of neurology at University College London Hospitals NHS Foundation Trust, who is widely considered one of the world’s leading experts on epilepsy, has said:

“Everyone experiences seizures differently. Some people may be psychologically distressed by a seizure and not physically harmed. Others may be physically harmed but not psychologically distressed. This will vary from person to person, and sometimes from seizure to seizure depending on individual circumstances.”

Amendment 112 would therefore expand the scope of clause 150 and make clear on the face of the Bill that an offence is also committed under the harmful communications clause when physical harm has occurred as a consequence of receiving a message sent online with malicious intent. In practical terms, if a person with epilepsy were to receive a harmful message online that triggered their epilepsy and they subsequently fell off their chair and hit their head, that physical harm would be proof of a harmful communications offence, without the need to prove any serious psychological distress that may have been caused.

This simple but effective amendment, supported by the Epilepsy Society, will ensure that the horrific trolling that Zach and others with epilepsy have had to endure will be covered in full by the Bill. That will mean that the total impact that such trolling has on the victims is reflected beyond solely psychological distress, so there can be no ambiguity and nowhere for those responsible for sending these images and videos to hide.

I am aware that the Minister has previously pointed to the possibility of a standalone Bill—a proposal that is under discussion in the Ministry of Justice. That is all well and good, but that should not delay our action when the Bill before us is a perfectly fit legislative vehicle to end epilepsy trolling, as the Law Commission report recommended.

I thank colleagues from across the House for the work they have done on this important issue. I sincerely hope that the amendment is one instance where we can be united in this Committee. I urge the Minister to adopt amendment 112, to implement Zach’s law in full and to provide the hundreds of thousands of people across the UK living with epilepsy with the legal protections they need to keep them safe online. Nothing would give me greater pleasure than to call at Zach’s house next time I am in the area and tell him that this is the case.

Dean Russell (Watford) (Con)

May I praise the hon. Member for Batley and Spen for such an eloquent and heartfelt explanation of why this amendment to the Bill is so important?

I have been campaigning on Zach’s law for the past nine months. I have spoken to Zach multiple times and have worked closely with my hon. Friend the Member for Stourbridge (Suzanne Webb) in engaging directly with Facebook, Twitter and the other big platforms to try to get them to do something, because we should not need a law to stop these flashing images being sent. We had got quite far a few months ago, but progress now seems to have stalled, which is very frustrating.

I am stuck between my heart and my head on this amendment. My heart says we need to include the amendment right now, sort it out and get it finalised. However, my head says we have got to get it right. During the Joint Committee on the draft Online Safety Bill before Christmas and in the evidence sessions for this Bill, we heard that if the platforms want to use a loophole and get around things, they will. I have even seen that in the engagements and the promises we have had.

Dean Russell

That is an excellent point. I have yet to make up my mind which way to vote if the amendment is pressed to a vote; I do not know whether this is a probing amendment. Having spoken to the Epilepsy Society and having been very close to this issue for many months, for me to feel comfortable I want the Minister not just to say, as he has said on the Floor of the House, to me personally, in meetings and again here today, that the clause should cover epilepsy and that he is very confident it does, but to give some assurance that we will change the law in some form.

Kim Leadbeater

I am incredibly grateful for the hon. Member’s comments and contribution. I agree wholeheartedly. We need more than a belief and an intention. There is absolutely no reason why we cannot have this in black and white in the Bill. I hope he can find a way to do the right thing today and vote for the amendment.

Dean Russell

The phrase “Do the right thing” is at the heart of this. My hon. Friend the Member for Ipswich (Tom Hunt) presented the Flashing Images Bill yesterday. A big part of this is about justice: I am conscious that we have got to get the balance right, and that stopping this happening has consequences for the people who choose to do it. I am keen to hear what the Minister says. We have got to get this right, and I am keen to get some assurances, which will very much sway my decision on the vote today.

Online Safety Bill (First sitting)

Debate between Dean Russell and Kim Leadbeater
Committee stage
Tuesday 24th May 2022

Public Bill Committees
Kim Leadbeater (Batley and Spen) (Lab)

Q Ofcom is required to produce certain codes, for example on terrorism, but others that were floated in the Green Paper are no longer in the Bill. Are you working on such codes, for example on hate crime and wider harm, and if not, what happens in the meantime? I guess that links to my concerns about the democratic importance and journalistic content provisions in the Bill, to which you have alluded. They are very vague protections and I am concerned that they could be exploited by extremists who suddenly want to identify as a journalist or a political candidate. Could you say a little about the codes and about those two particular clauses and what more you think we could do to help you with those?

Richard Wronka: I will cover the codes first. You are absolutely right that the Bill requires Ofcom to publish codes of practice, particularly on CSEA and on terror, as well as on fraudulent advertising and other areas. We are doing the work right now so that we are ready to progress with that process as soon as we get powers and duties, because it is really important that we are ready to move as quickly as possible. We will set out further detail on exactly how we plan to do that in a roadmap document that we are looking to publish before the summer break, so that will provide some of the detail.

A really important point here is that the Bill quite rightly covers a wide set of harms. We are mindful that, although it is tempting to have a code that covers every single harm, that could be counterproductive and confusing for platforms, even for those that want to comply and do the right thing. One of the balancing acts for us as we produce that code framework will be to get the right coverage of all the issues that everyone is rightly concerned about, but to do that in a way that is streamlined and efficient, so that services can apply the provisions of those codes.

Richard Wronka: Shall I pick up on the second bit very quickly? I think you are right; this is one of our central concerns about the definitions. As far as possible, this should be a matter for Parliament. It is really important to know that Parliament has a view on this. Ultimately, the regulator will take a view based on what Parliament says. We have some experience in this area but, as Richard said, we recognise the challenge—it is extremely complex. We can see the policy intent behind it, quite rightly, and the importance of enshrining freedom of expression as far as possible, but Parliament can help to add clarity and, as you rightly say, be aware of some of the potential loopholes. At the moment, someone could describe themselves as a citizen journalist; where does that leave us? I am not quite sure. Parliament could help to clarify that, and we would be grateful.

Dean Russell

Q Do the powers in the Bill cover enough to ensure that people will not be sent flashing images if they have photosensitive epilepsy?

Richard Wronka: This picks up the point we discussed earlier, which is that I understand that the Government are considering proposals from the Law Commission to criminalise the sending of those kinds of images. It would not be covered by the illegal content duties as things stand, but if the Government conclude that it is right to criminalise those issues, it would automatically be picked up by the Bill.

Even so, the regime is not, on the whole, going to be able to pick up every instance of harm. It is about making sure that platforms have the right systems and processes. Where there is clear harm to individuals, we would expect those processes to be robust. We know there is work going on in the industry on that particular issue to try and drive forward those processes.

Online Safety Bill

Debate between Dean Russell and Kim Leadbeater
2nd reading
Tuesday 19th April 2022

Commons Chamber
Dean Russell (Watford) (Con)

I had the great privilege of sitting on the Joint Committee on the draft Bill before Christmas and working with the Chair, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), fantastic Members from across both Houses and amazing witnesses.

We heard repeated stories of platforms profiting from pain and prejudice. One story that really affected me was that of Zach Eagling, a heroic young boy who has cerebral palsy and epilepsy and who was targeted with flashing images by cruel trolls trying to trigger seizures. Such seizures have been triggered in other people with epilepsy too, affecting their lives and risking not just harm but potentially death, depending on their situation. That is why I and my hon. Friend the Member for Stourbridge (Suzanne Webb)—and all members of the Joint Committee, actually, because this was in our report—backed Zach’s law.

Kim Leadbeater (Batley and Spen) (Lab)

Ten-year-old Zach is a child in my constituency who has, as the hon. Member said, cerebral palsy and epilepsy, and he has been subjected to horrendous online abuse. I hope that the Minister can provide clarity tonight and confirm that Zach’s law—which recognises that not just psychological harm and distress but also physical harm can result from online abuse and trolling—will be covered in the Bill.

Dean Russell

My understanding—hopefully this will be confirmed from the Dispatch Box—is that Zach’s law will be covered by clause 150 in part 10, on communications offences, but I urge the Ministry of Justice to firm that up further.

One thing that really came through for me was the role of algorithms. The only analogy I can find in the real world for the danger of algorithms is narcotics. This is about organisations that focused on and targeted harmful content at people to make them ever more addicted to harm and to harmful content. By doing that, they numbed the senses of people using technology and social media, so that they engaged in practices that did them harm, turning them against not only others but themselves. We heard awful stories about people doing such things as barcoding—young girls cutting themselves—which was the most vile thing to hear, especially as a parent myself. There was also the idea that it was somehow okay to be abusive to other people, and the fact that it became normalised to hurt oneself, including in ways that cannot be undone.

That leads on to a point about numbing the senses. I am really pleased that in debating the Bill today we have talked about the metaverse, because the metaverse is not just some random technology that we might talk about; it is about numbing the senses. It is about people putting on virtual reality headsets and living in a world that is not reality, even if only for a matter of minutes or hours. As we look at these technologies and at virtual reality, my concern is that children and young people will be encouraged to spend more time in worlds that are not real and that could include more harmful content. Such worlds are becoming ever more realistic, in the impact they can have and in their capability for user-to-user engagement.

I therefore think that, although the Bill currently covers Meta and the metaverse, we need to look at the metaverse almost as a tech platform in its own right. We will not get everything right at first; I fully support the Bill as it stands, but as we move forward we will need to continue to improve it, test it and adapt it as new technologies come out. That is why I very much support the idea of a continuing Joint Committee specifically on online safety, so that as time goes by the issues can be scrutinised and we can look at whether Ofcom is delivering in its role. Ultimately, we need to use the Bill as a starting point to prevent harm now and for decades to come.