Online Safety Bill Debate
Baroness Kidron (Crossbench - Life peer)
Lords Chamber

My Lords, I will speak in support of Amendments 250A and 250B; I am not in favour of Amendment 56, which is the compromise amendment. I thank the noble Baroness, Lady Newlove, for setting out the reasons for her amendments in such a graphic form. I declare an interest as a member of the Expert Group on an Individual Complaints Mechanism for the Government of Ireland.
The day a child or parent in the UK has a problem with an online service and realises that they have nowhere to turn is the day that the online safety regime will be judged to have failed in the eyes of the public. Independent redress is a key plank of any regulatory system. Ombudsmen and independent complaint systems are available across all sectors, from finance and health to utilities and beyond. As the noble Lord, Lord Stevenson, set out, they are part of all the tech regulation that has been, or is in the process of being, introduced around the world.
I apologise in advance if the Minister is minded to agree to the amendment, but given that, so far, the Government have conceded only a single word in a full six days in Committee, I dare to anticipate that that is not the case and suggest three things that he may say against the amendment: first, that any complaints system will be overwhelmed; secondly, that it will offer companies a get-out clause from putting their own robust systems in place; and, thirdly, that it will be too expensive.
The expert group of which I was a member looked very carefully at each of these questions and, after taking evidence from all around the globe, it concluded that the system need not be overwhelmed if it had the power to set clear priorities. In the case of Ireland, those priorities were complaints that might result in real-world violence and complaints from or on behalf of children. The expert group also determined that the individual complaints system should be
“afforded the discretion to handle and conclude complaints in the manner it deems most appropriate and is not unduly compelled toward or statutorily proscribed to certain courses of action in the Bill”.
For example, there was a lot of discussion on whether it could decide not to deal with copycat letters, treat multiple complaints on the same or similar issue as one, and so on.
Also, from evidence submitted during our deliberations, it became clear that many complainants have little idea of the law and that many complaints should be referred to other authorities, so among the accepted recommendations was that the individual complaints system should be
“provided with a robust legal basis for transferring or copying complaints to other bodies as part of the triage process”—
for example, to the data regulator, police, social services and other public bodies. The expert group concluded that this would actually result in better enforcement and compliance in the ecosystem overall.
On the point that the individual complaints mechanism may have the unintended consequence of making regulated services lazy, the expert group—which, incidentally, comprised a broad range of specialists, including ombudsmen, regulators and legal counsel—concluded that it was important for the regulator to set a stringent reporting and redress code of practice for regulated companies, so that it was not possible for any company to simply sit back until people were so fed up that they went to the complaints body. The expert group specifically said in its report that it
“is acutely aware of the risk of … the Media Commission … drawing criticism for the failings of the regulated entities to adequately comply with systemic rules. In this regard, an individual complaints mechanism should not be viewed as a replacement for the online platforms’ complaint handling processes”.
Indeed, the group felt that an individual complaints system complemented the powers given to the regulator, which could and should take enforcement action against those companies that persistently fail to introduce an adequate complaints system—not least because the flow of complaints would act as an early warning system for emerging harms, which is of course one of the regulator’s duties under the Bill.
When replying to a question from the noble Lord, Lord Knight of Weymouth, last week about funding digital literacy, the Minister made it clear that the online safety regime would be self-financing via the levy. That being the case, it does not seem disproportionate to have a focused and lean system in which the urgent, the vulnerable and the poorly served have somewhere to turn.
The expert group’s recommendation was accepted in full by Ireland’s Minister for Media, Culture and Tourism, Catherine Martin, who said she would
“always take the side of the most vulnerable”
and the complaint system would deal with people who had
“exhausted the complaints handling procedures by any online services”.
I have had the pleasure of talking to the new leadership of Ireland’s Media Commission in recent weeks, and it is expected to be open for business in 2024.
I set that out at length just to prove that it is possible. It was one of the strong recommendations of the pre-legislative committee, and had considerable support in the other place, as we have heard. I think both Ofcom and DSIT should be aware that many media outlets have not yet clocked that this complicated Bill is so insular that the users of tech have no place to go and no voice.
While the Bill can be pushed through without a complaints system, this leaves it vulnerable. It takes only one incident, or a sudden copycat rush of horrors, ignored or trivialised by the sector and leaving complainants with nowhere to go but the press, to undermine confidence in the whole regulatory edifice.
My Lords, I had to miss a few sessions of the Committee but I am now back until the end. I remind fellow Members of my interests: I worked for one of the largest platforms for a decade, but I have no current interests. It is all in the register if people care to look. I want to contribute to this debate on the basis of that experience of having worked inside the platforms.
I start by agreeing with the noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, and my noble friend Lord Clement-Jones. The thrust of their amendments—the idea that something will be needed here—is entirely correct. We have created in the Online Safety Bill a mechanism that we in this Committee know is intended primarily to focus on systems and how Ofcom regulates them, but what the public out there hear is that we are creating a mechanism that will meet their concerns—and their concerns will not end with systems. As the noble Baroness, Lady Newlove, eloquently described, their concerns in some instances will be about specific cases and the question will be: who will take those up?
If there is no other mechanism and no way to signpost people to a place where they can seek redress, they will come to Ofcom. That is something we do not want. We want Ofcom to be focused on the big-ticket items of dealing with systems, not bogged down in dealing with thousands of individual complaints. So we can anticipate a situation in which we will need someone to be able to deal with those individual complaints.
I want to focus on making that workable, because the volume challenge might not be as people expect. I have seen from having worked on the inside that there is a vast funnel of reports, where people report content to platforms. Most of those reports are spurious or vexatious; that is the reality. Platforms have made their reporting systems easy, as we want them to do—indeed, in the Bill we say, “Make sure you have really easy-to-use reporting systems”—but one feature of that is that people will use them simply to express a view. Over the last couple of weeks, all the platforms will have been inundated with literally millions of reports about Turkish politicians. These will come from the supporters of either side, reporting people on the other side—claiming that they are engaged in hate speech or pornography or whatever. They will use whatever tool they can. That is what we used to see day in, day out: football teams or political groups that report each other. The challenge is to separate out the signal—the genuinely serious reports of where something is going wrong—from the vast amount of noise, of people simply using the reporting system because they can. For the ombudsman, the challenge will be that signal question.
Breaking that down, from the vast funnel of complaints coming in, we have a smaller subset that are actionable. Some of those will be substantive, real complaints, where the individual simply disagrees with the decision. That could be for one of two main reasons. The first is that the platform has made a bad decision and failed to enforce its own policies. For example, you reported something as being pornographic, and it obviously was, but the operator was having a bad day—they were tired, it was late in the day and they pressed “Leave up” instead of “Take down”. That happens on a regular basis, and an error rate of even 1% across a huge volume means a lot of mistakes being made. Those kinds of issues, where there is a simple operator error, should get picked up by the platforms’ own appeal mechanisms. That is what they are there for, and the Bill rightly points to that. A second reviewer should look at it. Hopefully they are a bit fresher, understand that a mistake was made and can simply reverse it. Those operator error reports can be dealt with internally.
The second type would be where the platform enforces policies correctly but, from the complainant’s point of view, the policies are wrong. It may be a more pro-free speech platform where the person says, “This is hate speech”, but the platform says, “Well, according to our rules, it is not. Under our terms of service, we permit robust speech of this kind. Another platform might not, but we do”. In that case, the complainant is still unhappy but the platform has done nothing wrong—unless the policies the platform is enforcing are out of step with the requirements under the Online Safety Bill, in which case the complaint should properly come to Ofcom. Based on the individual complaint, a complainant may have something material for Ofcom. They are saying that they believe the platform’s policies and systems are not in line with the guidance issued by Ofcom—whether on hate speech, pornography or anything else. That second category of complaint would come to Ofcom.
The third class concerns the kind of complaint that the noble Baroness, Lady Newlove, described. In some ways, this is the hardest. The platform has correctly enforced its policies but, in a particular case, the effect is deeply unfair, problematic and harmful for an individual. The platform simply says, “Look, we enforced the policies. They are there. This piece of content did not violate them”. Any outsider looking at it would say, “There is an injustice here. We can clearly see that an individual is being harmed. A similar piece of content might not be harmful to another individual, but to this individual it is”. In those circumstances, groups such as the South West Grid for Learning, with which I work frequently, perform an invaluable task. We should recognise that there is a network of non-governmental organisations in the United Kingdom that do this day in, day out. Groups such as the Internet Watch Foundation and many others have fantastic relations and connections with the platforms and regularly bring exceptional cases to them.
We are glad to have the noble Lord back. I want also to put on the record that the South West Grid for Learning is very supportive of this amendment.
It has let me know as well. In a way, the amendment seeks to formalise what is already an informal mechanism. I was minded initially to support Amendment 56 in the name of my noble friend Lord Clement-Jones and the noble Lord, Lord Stevenson.
This landscape is quite varied. We have to create some kind of outlet, as the noble Baroness, Lady Kidron, rightly said. That parent or individual will want to go somewhere, so we have to send them somewhere. We want that somewhere to be effective, not to get bogged down in spurious and vexatious complaints. We want it to have a high signal-to-noise ratio—to pull out the important complaints and get them to the platforms. That will vary from platform to platform. In some ways, we want to empower Ofcom to look at what is and is not working and to be able to say, “Platform A has built up an incredible set of mechanisms. It’s doing a good job. We’re not seeing things falling through the cracks in the same way as we are seeing with platform B. We are going to have to be more directive with platform B”. That very much depends on the information coming in and on how well the platforms are doing their job already.
I hope that the Government are thinking about how these individual complaints will be dealt with and about the demand that will be created by the Bill. How can we have effective mechanisms for people in the United Kingdom who genuinely have hard cases and have tried, but where there is no intermediary for the platform they are worried about? In many cases, I suspect that these will be newer or smaller platforms that have arrived on the scene and do not have established relationships. Where are these people to go? Who will help them, particularly in cases where the platform may not systemically be doing anything wrong? Its policies are correct and it is enforcing them correctly, but any jury of peers would say that an injustice is being done. Either an exception needs to be made or there needs to be a second look at that specific case. We are not asking Ofcom to do this in the rest of the legislation.
Following on from my friend, the noble Lord, Lord Russell, can I just say to the Minister that I would really welcome all of us having a meeting? As I am listening to this, I am thinking that three to five years is just horrific for the families. This Bill has gone on for so long to get where we are today. We are losing sight of humanity here and of the moral compass of protecting human lives. For whichever Government is in place in three to five years to make the decision to say it does not work is absolutely shameful. Nobody in the Government will be accountable, and yet for that family, that single person may commit suicide. We have met the bereaved families, so I say to the Minister that we need to go round the table and look at this again. I do not think it is acceptable to say that there is this timeline, this review, for the Secretary of State when we are dealing with young lives. It is in the public interest to get this Bill correct as it navigates its way back to the House of Commons in a far better state than that in which it arrived.
I would love the noble Viscount to answer my very specific question about who the Government think families should turn to when they have exhausted the complaints system in the next three to five years. I say that as someone who has witnessed successive Secretaries of State promising families that this Bill would sort this out. Yes?
I stress again that the period in question is two years, not three.
It is between two and five years. It can be two; it can be five. I am very happy to meet my noble friend and to carry on doing so. The complaints procedure set up for families is first to approach the service provider and, should the provider fail to meet its enforceable duties, then to revert to Ofcom and, beyond that, the courts.
I am sorry but that is exactly the issue at stake. The understanding of the Committee currently is that there is then nowhere to go if they have exhausted that process. I believe that complainants are not entitled to go to Ofcom in the way that the noble Viscount just suggested.
Considerably more rights are provided than complainants have today with the service provider. Indeed, Ofcom would not necessarily deal with individual complaints—
I have offered a meeting; I am very happy to host the meeting to bottom out these complaints.
I understand that the Minister has been given a sticky wicket of defending the indefensible. I welcome a meeting, as I think the whole Committee does, but it would be very helpful to hear the Government say that they have chosen to give individuals no recourse under the Bill—that this is the current situation, as it stands, and that there is no concession on the matter. I have been in meetings with people who have been promised such things, so it is really important, from now on in Committee, that we actually state at the Dispatch Box what the situation is. I spent quite a lot of the weekend reading circular arguments, and we now need to get to an understanding of what the situation is. We can then decide, as a Committee, what we do in relation to that.
As I said, I am very happy to hold the meeting. We are giving users greater protection through the Bill, and, as agreed, we can discuss individual routes to recourse.
I hope that, on the basis of what I have said and the future meeting, noble Lords have some reassurance that the Bill’s complaint mechanisms will, eventually, be effective and proportionate, and feel able not to press their amendments.
Yes, that would be a sensible way to view it. We will work on that and allow noble Lords to see it before they come to talk to us about it.
I put on record that the withdrawal of Part 3 of the Digital Economy Act 2017 will be greeted with happiness only should the full schedule of AV and harms be put into the Bill. I must say that because the noble Baroness, Lady Benjamin, is not in her place. She worked very hard for that piece of legislation.
My Lords, I thank the Minister for his response. I take it as a win that we have been offered a meeting and further discussion, and the noble Lord, Lord Foulkes, agreeing with every word I said. I hope we can continue in this happy vein in my time in this House.
The suggestion from the noble Lord, Lord Stevenson, of a table is a welcome one. Something that has interested me is that some of the offences the Minister mentioned were open goals: there were holes that left matters open in Northern Ireland but not in England and Wales, or the other way round. For example, epilepsy trolling is already a criminal offence in Scotland, but I am not sure that was appreciated when we started this discussion.
I look forward to the meeting and I thank the Minister for his response. I am still unconvinced that we have the right consultation process for any devolved authority wanting to apply for a subordinate devolved Administration to be included under this regime.
It concerns me that the Minister talked about leaving it to Ofcom to request the data it deemed appropriate. The feeling on the ground is that Ofcom, which is based in London, may not understand what is or is not appropriate in the devolved Administrations. The fact that in other legislation—for example, on broadcasting—it is mandated that data is broken down nation by nation is really important. It is even more important because of the interplay between devolved and reserved matters. The fact that there is no equivalent Minister in the Scottish Government with whom to discuss digital and online safety matters means that a whole raft of different people will need to have relationships with Ofcom who have not had them hitherto.
I thank the Minister. On that note, I withdraw my amendment.
My Lords, I also support the amendments in the name of my noble friend Lady Finlay. I want to address a couple of issues raised by the noble Lord, Lord Allan. He made a fantastic case for adequate redress systems, both at platform level and at independent complaint level, to really make sure that, at the edge of all the decisions we make, there is sufficient discussion about where that edge lies.
The real issue is not so much the individuals who are in recovery and seeking to show leadership but those who are sent down the vortex of self-harm and suicide material that comes in its scores—in its hundreds and thousands—and completely overwhelms them. We must not, in worrying about the edge case, fail to deal with the issue at hand.
There is absolutely not enough signposting. I have seen at first hand—I will not go through it again; I have told the Committee already—companies justifying material that it was inconceivable to justify as being a cry for help. A child with cuts and blood running down their body is not a cry for help; that is self-harm material.
From experience, I think it is true that companies get defensive and seek to defend the indefensible on occasion. I agree with the noble Baroness on that, but I will balance it a little as I also work with people who were agonising over not wanting to make a bad situation worse. They were genuinely struggling and seeking to do the right thing. That is where the experts come in. If someone would say to them, “Look, take this stuff down; that is always better”, it would make their lives easier. If they said, “Please leave it up”, they could follow that advice. Again, that would make their lives easier. On the excuses, I agree that sometimes they are defending the indefensible, but also there are people agonising over the right thing to do and we should help them.
I absolutely agree. Of course, good law is a good system, not a good person.
I turn to the comments that I was going to make. Uncharacteristically, I am a little confused about this issue and I would love the Minister’s help. My understanding on reading the Bill very closely is that self-harm and suicide content that meets a legal definition will be subject to the priority illegal content duties. In the case of children, we can safely anticipate that content of this kind will be named primary priority content. Additionally, if such content is against the terms of service of a regulated company, the company can be held to those terms. Category 1 services will have to provide a user empowerment tool so that such content can be toggled out if an adult user wishes. That is my understanding of where this content has already been dealt with in the Bill. To my mind, this leaves the following ways in which suicide and self-harm material, which is the subject of this group of amendments, is not covered by the Bill. That is what I would like the Minister to confirm, and I absolutely stand by to be corrected.
In the case of adults, if self-harm and suicide material does not meet the bar of illegal content and the service is not category 1, there is no mechanism to toggle it out. Ofcom has no power to require a service to provide tools to toggle self-harm and suicide material out by default. This means that self-harm and suicide material can be as prevalent as a service likes—pushed, promoted and recommended, as I have just explained—provided it is not contrary to the terms of service and does not reach the bar of illegal content.
Search services are not subject to these clauses—I am unsure about that. In the case of both children and adults, if self-harm and suicide material is on blogs or services with limited functionality, it is out of scope of the Bill and there is absolutely nothing Ofcom can do. For non-category 1 services—the majority of services which claim that an insignificant number of children access their site and thus that they do not have to comply with the child safety duties—there are no protections for a child against this content.
I put it like that because I believe that each of the statements I just made could have been fixed by amendments already discussed during the past six days in Committee. We are currently planning to leave many children without the protection of the safety duties, to leave vulnerable adults without even the cover of default protections against material that has absolutely no public interest and to leave companies to decide whether to promote or use this material to fuel user engagement—even if it costs well-being and lives.
I ask the Minister to let me know if I have misunderstood, but I think it is really quite useful to see what is left once the protections are in place, rather than always concentrating on the protections themselves.
My Lords, I support the noble Baroness, Lady Finlay of Llandaff, in her Amendment 96 and others in this group. The internet is fuelling an epidemic of self-harm, often leading to suicide among young people. Thanks to the noble Baroness, Lady Kidron, I have listened to many grieving families explaining the impact that social media had on their beloved children. Content that provides detailed instructions for methods of suicide, or challenges or pacts that seek agreement to undertake mutual acts of suicide or deliberate self-injury, must be curtailed, or platforms must be made to warn and protect vulnerable adults.
I recognise that the Government acknowledge the problem and have attempted to tackle it in the Bill with the new offence of encouraging or assisting serious self-harm and suicide and by listing it as priority illegal content. But I agree with charities such as Samaritans, which says that the Government are taking a partial approach by not accepting this group of amendments. Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include information, depictions, instructions and advice on methods of self-harm or suicide; content that portrays self-harm and suicide as positive or desirable; and graphic descriptions or depictions of self-harm and suicide.
With the removal of regulation of legal but harmful content, much suicide and self-harm content can remain easily available, and platforms will not even need to consider the risk that such content could pose to adult users. These amendments aim to ensure that harmful self-harm and suicide content is addressed across all platforms and search services, regardless of their functionality or reach, and, importantly, for all persons regardless of age.
In 2017 an inquiry into suicides of young people found suicide-related internet use in 26% of deaths in under-20s and 13% of deaths in 20 to 24-year-olds. Three-quarters of people who took part in Samaritans’ research with Swansea University said that they had harmed themselves more severely after viewing self-harm content online, as the noble Baroness, Lady Finlay, pointed out. People of all ages can be susceptible to harm from this dangerous content. There is shocking evidence that between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 82% of those patients were over 25.
Suicide is complex and rarely caused by one thing. However, there is strong evidence of associations between financial difficulties, mental health and suicide. People on the lowest incomes have a higher suicide risk than those who are wealthier, and people on lower incomes are also the most affected by rising prices and other types of financial hardship. In January and February this year the Samaritans saw the highest percentage of first-time phone callers concerned about finance or unemployment—almost one in 10 calls for help in February. With the cost of living crisis and growing pressure on adults to cope with stress, it is imperative that the Government urgently bring in these amendments to help protect all ages from harmful suicide and self-harm content by putting a duty on providers of user-to-user services to properly manage such content.
A more comprehensive online safety regime for all ages will also increase protections for children, as research has shown that age verification and restrictions across social media and online platforms are easily bypassed by them. As the Bill currently stands, there is a two-tier approach to safety which can still mean that children may circumvent safety controls and find this harmful suicide and self-harm content.
Finally, user empowerment duties that we debated earlier are no substitute for regulation of access to dangerous suicide and self-harm online content through the law that these amendments seek to achieve.
My initial response is, yes, I think so, but it is the role of Ofcom to look at whether those terms of service are enforced and to act on behalf of internet users. The noble Lord is right to point to the complexity of some marginal cases with which companies have to deal, but the whole framework of the Bill is to make sure that terms of service are being enforced. If they are not, people can turn to Ofcom.
I am sorry to enter the fray again on complaints, but how will anyone know that they have failed in this way if there is no complaints system?
I refer to the meeting my noble friend Lord Camrose offered; we will be able to go through and unpick the issues raised in that group of amendments, rather than looping back to that debate now.
I thank the noble Lord for the advance notice to think about that; it is helpful. It is difficult to talk in general terms about this issue, so, if I can, I will give examples that do, and do not, meet the threshold.
The Bill goes even further for children than it does for adults. In addition to the protections from illegal material, the Government have indicated, as I said, that we plan to designate content promoting suicide, self-harm or eating disorders as categories of primary priority content. That means that providers will need to put in place systems designed to prevent children of any age encountering this type of content. Providers will also need specifically to assess the risk of children encountering it. Platforms will no longer be able to recommend such material to children through harmful algorithms. If they do, Ofcom will hold them accountable and will take enforcement action if they break their promises.
It is right that the Bill takes a different approach for children than for adults, but that does not mean that the Bill fails to recognise that young adults are at risk, or that it lacks protections for them. My noble friend Lady Morgan was right to raise the issue of young adults once they turn 18. The triple shield of protection in the Bill will significantly improve the status quo by protecting adults, including young adults, from illegal suicide content and from legal suicide or self-harm content that is prohibited in major platforms’ terms and conditions. Platforms also have strong commercial incentives, as we discussed in previous groups, to address harmful content that the majority of their users do not want to see, such as legal suicide, eating disorder or self-harm content. That is why they currently claim to prohibit it in their terms and conditions, and why we want to make sure that those terms and conditions are transparently and accountably enforced. So, while I sympathise with the intention of the noble Baroness, Lady Finlay, her amendments raise some wider concerns about mandating how providers should deal with legal material, which would interfere with the careful balance the Bill seeks to strike in ensuring that users are safer online without compromising their right to free expression.
The noble Baroness’s Amendment 240, alongside Amendment 225 in the name of the noble Lord, Lord Stevenson, would place new duties on Ofcom in relation to suicide and self-harm content. The Bill already has provisions to provide Ofcom with broad and effective information-gathering powers to understand how this content affects users and how providers are dealing with it. For example, under Clause 147, Ofcom can already publish reports about suicide and self-harm content, and Clauses 68 and 69 empower Ofcom to require the largest providers to publish annual transparency reports.
Ofcom may require those reports to include information on the systems and processes that providers use to deal with illegal suicide or self-harm content, with content that is harmful to children, or with content which providers’ own terms of service prohibit. Those measures sit alongside Ofcom’s extensive information-gathering powers. It will have the ability to access the information it needs to understand how companies are fulfilling their duties, particularly in taking action against this type of content. Furthermore, the Bill is designed to provide Ofcom with the flexibility it needs to respond to harms—including in the areas of suicide, self-harm and eating disorders—as they develop over time, in the way that the noble Baroness envisaged in her remarks about the metaverse and new emerging threats. So we are confident that these provisions will enable Ofcom to assess this type of content and ensure that platforms deal with it appropriately. I hope that this has provided sufficient reassurance to the noble Baroness for her not to move her amendment.
I asked a number of questions on specific scenarios. If the Minister cannot answer them straight away, perhaps he could write to me. They all rather called for “yes/no” answers.
The noble Baroness threw me off with her subsequent question. She was broadly right, but I will write to her after I refresh my memory about what she said when I look at the Official Report.
My Lords, I have added my name to Amendments 97 and 304, and I wholeheartedly agree with all that the noble Baroness, Lady Morgan, said by means of her excellent introduction. I look forward to hearing what the noble Baroness, Lady Kidron, has to say as she continues to bring her wisdom to the Bill.
Let me say from the outset, if it has not been said strongly enough already, that violence against women and girls is an abomination. If we allow a culture of intimidation and misogyny to exist online, it will spill over to offline experiences. According to research by Refuge, almost one in five domestic abuse survivors who experienced abuse or harassment from their partner or former partner via social media said they felt afraid of being attacked or being subjected to physical violence as a result. Some 15% felt that their physical safety was more at risk, and 5% felt more at risk of so-called honour-based violence. Shockingly, according to Amnesty International, 41% of women who experienced online abuse or harassment said that these experiences made them feel that their physical safety was threatened.
Throughout all our debates, I hesitate to differentiate between the real and virtual worlds, because that is simply not how we live our lives. Interactions online are informed by face-to-face interactions, and vice versa. To think otherwise is to misunderstand the lived experience of the majority—particularly, dare I say, the younger generations. As Anglican Bishop for HM Prisons, I recognise the complexity of people’s lives and the need to tackle attitudes underpinning behaviours. Tackling the root causes of offending should always be a priority; there is potential for much harm later down the line if we ignore warning signs of hatred and misogyny. Research conducted by Refuge found that one in three women has experienced online abuse or harassment perpetrated on social media or another online platform at some point in their lives. That figure rises to almost two in three, or 62%, among young women. This must change.
We did some important work in your Lordships’ House during the passage of the Domestic Abuse Act to ensure that all people, including women and girls, are safe on our streets and in their homes. As has been said, introducing a code of practice as outlined will help the Government meet their aim of making the UK the safest place in the world to be online, and it will align with the Government’s wider priority to tackle violence against women and girls as a strategic policing requirement. Other strategic policing requirements, including terrorism and child sexual exploitation, have online codes of practice, so surely it follows that there should be one for VAWG to ensure that the Bill aligns with the Government’s position elsewhere and that there is not a gap left online.
I know the Government care deeply about tackling violence against women and girls, and I believe they have listened to some concerns raised by the sector. The inclusion of the domestic abuse and victims’ commissioners as statutory consultees is welcome, as is the Government’s amendment to recognise controlling and coercive behaviour as a priority offence. However, without this code of practice, the Bill will fail to address duties of care in relation to preventing domestic abuse and violence against women and girls in a holistic and encompassing way. The onus should not be on women and girls to remove themselves from online spaces; we have seen plenty of that in physical spaces over the years. Women and girls must be free to express themselves appropriately online and offline without fear of harassment. We must do all we can to prevent expressions of misogyny from transforming into violent actions.
My Lords, I have added my name to Amendments 97 and 304, and I support the others in this group. It seems to be a singular failure of any version of an Online Safety Bill if it does not set itself the task of tackling known harms—harms that are experienced daily and for which we have a phenomenal amount of evidence. I will not repeat the statistics given in the excellent speeches made by the noble Baroness, Lady Morgan, and the right reverend Prelate, but will instead add two observations.