My Lords, I have added my name to Amendments 97 and 304, and I wholeheartedly agree with all that the noble Baroness, Lady Morgan, said in her excellent introduction. I look forward to hearing what the noble Baroness, Lady Kidron, has to say as she continues to bring her wisdom to the Bill.
Let me say at the outset, if it has not been said strongly enough already, that violence against women and girls is an abomination. If we allow a culture of intimidation and misogyny to exist online, it will spill over to offline experiences. According to research by Refuge, almost one in five domestic abuse survivors who experienced abuse or harassment from their partner or former partner via social media said they felt afraid of being attacked or subjected to physical violence as a result. Some 15% felt that their physical safety was more at risk, and 5% felt more at risk of so-called honour-based violence. Shockingly, according to Amnesty International, 41% of women who experienced online abuse or harassment said that these experiences made them feel that their physical safety was threatened.
Throughout all our debates, I hesitate to differentiate between the real and virtual worlds, because that is simply not how we live our lives. Interactions online are informed by face-to-face interactions, and vice versa. To think otherwise is to misunderstand the lived experience of the majority—particularly, dare I say, the younger generations. As Anglican Bishop for HM Prisons, I recognise the complexity of people’s lives and the need to tackle attitudes underpinning behaviours. Tackling the root causes of offending should always be a priority; there is potential for much harm later down the line if we ignore warning signs of hatred and misogyny. Research conducted by Refuge found that one in three women has experienced online abuse or harassment perpetrated on social media or another online platform at some point in their lives. That figure rises to almost two in three, or 62%, among young women. This must change.
We did some important work in your Lordships’ House during the passage of the Domestic Abuse Act to ensure that all people, including women and girls, are safe on our streets and in their homes. As has been said, introducing a code of practice as outlined will help the Government meet their aim of making the UK the safest place in the world to be online, and it will align with the Government’s wider priority to tackle violence against women and girls as a strategic policing requirement. Other strategic policing requirements, including terrorism and child sexual exploitation, have online codes of practice, so surely it follows that there should be one for VAWG to ensure that the Bill aligns with the Government’s position elsewhere and that there is not a gap left online.
I know the Government care deeply about tackling violence against women and girls, and I believe they have listened to some concerns raised by the sector. The inclusion of the domestic abuse and victims’ commissioners as statutory consultees is welcome, as is the Government’s amendment to recognise controlling and coercive behaviour as a priority offence. However, without this code of practice, the Bill will fail to address duties of care in relation to preventing domestic abuse and violence against women and girls in a holistic and encompassing way. The onus should not be on women and girls to remove themselves from online spaces; we have seen plenty of that in physical spaces over the years. Women and girls must be free to express themselves appropriately online and offline without fear of harassment. We must do all we can to prevent expressions of misogyny from transforming into violent actions.
My Lords, I have added my name to Amendments 97 and 304, and I support the others in this group. It seems to be a singular failure of any version of an Online Safety Bill if it does not set itself the task of tackling known harms—harms that are experienced daily and for which we have a phenomenal amount of evidence. I will not repeat the statistics given in the excellent speeches made by the noble Baroness, Lady Morgan, and the right reverend Prelate, but will instead add two observations.
I want to answer the point that amendments cannot be seen in isolation. Noble Lords will remember that we had a long and good debate about what constituted harms to children. There was a big argument, and the Minister made some warm noises about putting harms to children in the Bill. There is some alignment among many people in the Chamber that we, and Parliament, should determine what harm is, and I very much share the noble Baroness’s concern about pointing out what that is.
On the issue of the system versus the content, I am not sure that this is the exact moment to raise it, but the idea of unintended consequences keeps getting thrown up when we talk about trying to point the finger at what creates harm. There are unintended consequences now, except that neither Ofcom, the Secretary of State nor Parliament has any say in what those unintended consequences are; only the tech sector does. As someone who has been bungee jumping, I am deeply grateful that there are very strict rules under which that is allowed to happen.
My Lords, I support the amendments in this group that, with regard to safety by design, will address functionality and harms—whatever exactly we mean by that—as well as child safety duties and codes of practice. The noble Lord, Lord Russell, and the noble Baronesses, Lady Harding and Lady Kidron, have laid things out very clearly, and I wish the noble Baroness, Lady Kidron, a happy birthday.
I also support Amendment 261 in the name of my right reverend friend the Bishop of Oxford and supported by the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville. This amendment would allow the Secretary of State to consider safety by design, and not just content, when reviewing the regime.
As we have heard, a number of the amendments would amend the safety duties to children to consider all harms, not just harmful content, and we have begun to have a very interesting debate on that. We know that service features create and amplify harms to children. These harms are not limited to spreading harmful content; features in and of themselves may cause harm—for example, beautifying filters, which can create unrealistic body ideals and pressure on children to look a certain way. In all of this, I want us to listen much more to the voices of children and young people—they understand this issue.
Last week, as part of my ongoing campaign on body image, including how social media can promote body image anxiety, I met a group of young people from two Gloucestershire secondary schools. They were very good at saying what the positives are, but noble Lords will also be very familiar with many of the negative issues that were on their minds, which I will not repeat here. While they were very much alive to harmful content and the messages it gives them, they were keen to talk about the need to address algorithms and filters that they say feed them strong messages and skew the content they see, which might not look harmful but, because of design, accentuates their exposure to issues and themes about which they are already anxious. Suffice it to say that underpinning most of what they said to me was a sense of powerlessness and anxiety when navigating the online world that is part of their daily lives.
The current definition of content does not include design features. Building in a safety by design principle from the outset would reduce harms in a systematic way, and the amendments in this group would address that need.