Online Safety Bill Debate
Baroness Burt of Solihull (Liberal Democrat - Life peer)
(1 year, 6 months ago)
Lords Chamber
My Lords, this is my first contribution to the Bill, and I feel I need to apologise in advance for my lack of knowledge and expertise in this whole field. In her initial remarks, the noble Baroness, Lady Morgan of Cotes, said, “Don’t worry, because you don’t need to be a lawyer”. Unfortunately, I do not have any expertise in the field of the internet and social media either, so I will be very brief in all my remarks on the Bill. But I feel that I cannot allow the Bill to go past without at least making a few remarks, as equalities spokesperson for the Lib Dems. The issues are of passionate importance to me, and of course to victims of online abuse, and it is those victims for whom I speak today.
In this group, I will address my remarks to Amendments 34 and 35, which concern content deemed to be harmful—suicide, self-harm, eating disorders, and abuse and hate content—under the triple shield approach, although the discussion has strayed somewhat during the course of the debate.
Much harmful material, as we have heard, initially comes to the user uninvited. I do not pretend to understand how these algorithms work, but my understanding is that if you open one piece of such content, they click into action, feeding more and more of this kind of content to you in your feed. The suicide of young Molly Russell is a devastating example of how much damage these algorithms can contribute to. I am glad that the Bill will go further to protect children, but it still leaves adults—some of them young and vulnerable—without such protection, automatically exposed to harmful content which algorithms amplify with engagement and which can have overwhelming impacts on their mental health, as my noble friend Lady Parminter so movingly and eloquently described.
So this amendment means a user would have to make an active, conscious choice to be exposed to such content: an opt out rather than an opt in. This has been discussed at length by noble Lords a great deal more versed in the subject than me. But surely the only persons or organisations who would not support this would be the ones who do not have the best interests of the vulnerable users we have been talking about this afternoon at heart. I hope the Minister will confirm in his remarks that the Government do.
My Lords, I had not intended to speak in this debate because I now need to declare an unusual interest, in that Amendment 38A has been widely supported outside this Chamber by my husband, the Member of Parliament for Weston-super-Mare. I am not intending to speak on that amendment but, none the less, I mention it just in case.
I rise to speak because I have been so moved by the speeches, not least the right reverend Prelate’s speech. I would like just to briefly address the “default on” amendments and add my support. Like others, on balance I favour the amendments in the name of the noble Lord, Lord Clement-Jones, but would willingly throw my support behind my noble friend Lady Morgan were that the preferred choice in the Chamber.
I would like to simply add two additional reasons why I ask my noble friend the Minister to really reflect hard on this debate. The first is that children become teenagers, who become young adults, and it is a gradual transition—goodness, do I feel it as the mother of a 16-year-old and a 17-year-old. The idea that on one day all the protections just disappear completely and we require our 18-year-olds to immediately reconfigure their use of all digital tools just does not seem a sensible transition to adulthood to me, whereas the ability to switch off user empowerment tools as you mature as an adult seems a very sensible transition.
Secondly, I respect very much the free speech arguments that the noble Baroness, Lady Fox, made, but I do not think this is a debate about the importance of free speech. It is actually about how effective the user empowerment tools are. If they are so hard for non-vulnerable adults to turn off, what hope have vulnerable adults of turning them on? For the triple shield to work and the three-legged stool to be effective, the onus needs to be on the tech companies to make these user empowerment tools really easy to turn on and turn off. Then “default on” is not a restriction on freedom of speech at all; it is simply a means of protecting our most vulnerable.
Online Safety Bill Debate
Baroness Burt of Solihull (Liberal Democrat - Life peer)
(1 year, 6 months ago)
Lords Chamber
My Lords, it is an honour to follow some very knowledgeable speakers, whose knowledge is much greater than mine. Nevertheless, I feel the importance of this debate above and beyond any other that I can think of on this Bill. However, I do not agree with the noble Baroness, Lady Stowell of Beeston, who said that women should not be victims. They are not victims; they are being victimised. We need a code—the code that is being proposed—not for the victims but for the tech companies, because of the many diverse strands of abuse that women face online. This is an enabler for the tech companies to get their heads around what is coming and to understand it a lot better. It is a helpful tool, not a mollycoddling tool at all.
I strongly agree with everything else, apart from what was said by the noble Baroness, Lady Fox, which I will come on to in a second. I and, I am sure, other noble Lords in this Chamber have had many hundreds of emails from concerned people, ordinary people, who nevertheless understand the importance of what this code of practice will achieve today. I speak for them, as well as the others who have supported this particularly important amendment.
As their supporters have pointed out in this Chamber, Amendments 97 and 304 are the top priority for the Domestic Abuse Commissioner, who believes that, if they do not pass, the Bill will not go far enough to prevent and respond effectively to domestic abuse online. The noble Baroness, Lady Fox, spoke about the need to keep a sense of proportion, but online abuse is everywhere. According to the charity Refuge—I think this was mentioned earlier—over one-third of women and 62% of young women have experienced online abuse and harassment.
I am sure that the Minister is already aware that a sector coalition of experts on violence against women and girls put together the code of practice that we are discussing today. It is needed, as I have said, because of the many strands of abuse that are perpetrated online. However, compliance with the new terms of service to protect women and girls is not cheap. In cost-driven organisations, the temptation will be to relax standards as time goes by, as we have seen in the past in the cases of Facebook and Twitter. The operators’ feet must be held to the fire with this new, stricter and more comprehensive code. People’s lives depend on it.
In his remarks, can the Minister indicate whether the Government are at least willing to look at this code? Otherwise, can he explain how the Government will ensure that domestic abuse and its component offences are understood by providers in the round?
My Lords, I rise to support the noble Baronesses, Lady Morgan and Lady Kidron, the right reverend Prelate the Bishop of Gloucester and the noble Lord, Lord Knight of Weymouth, on Amendment 97 to Clause 36 to mandate Ofcom to produce codes of practice, so that these influential online platforms have to respond adequately to tackle online violence against women and girls.
Why should we care about these codes of practice being in the Bill? Failing to include them will have far-reaching consequences, of which we have already heard many examples. First, it will threaten progress on gender equality. As the world moves to an increasingly digital future, with more and more connections and conversations moving online, women must have the same opportunity as men to be a part of the online world and benefit from being in the online space.
Secondly, it will threaten the free speech of women. The voices of women are more likely to be suppressed. Because of abuse, women are more likely to reduce their social media activity or even leave social media platforms altogether.
Thirdly, we will be failing in our obligation to protect the human rights of women. Every woman has the right to be and feel safe online. I thank the noble Baroness, Lady Kidron, who highlighted online abuse due to intersecting identities. The noble Baroness, Lady Stowell, mentioned that this could cause divisions; there are divisions already, given the level of online abuse faced by women. Until we get an equal and just society, additional measures are needed. I know that the noble Baroness, Lady Fox, is worried about censorship, but women also have the right to feel safe online and offline. The noble Baroness is worried about whether this is a proportionate response, but I do feel that it is.
Relying on tech companies to self-regulate on VAWG is a bad idea. At present, the overwhelming majority of tech companies are led by men and their employees are most likely to be men, who will be taking decisions on content and on moderating that content. So we are relying on the judgment of a sector that itself needs to be more inclusive of women and is known for not sufficiently tackling the online abuse of women and girls.
I will give a personal example. Someone did not like what I said on Twitter and posted a message with a picture of a noose, which I found threatening. I reported that and got a response to say that it did not violate terms and conditions, so it remained online.
The culture at these tech companies was illustrated a few years ago when employees at Google walked out to protest against sexism. Also, research a couple of years ago by a campaign group called Global Witness found that Facebook used biased algorithms that promoted career and gender stereotypes, resulting in particular job roles being seen by men and others being seen by women. We know that other algorithms are even more harmful and sinister and promote hatred and misogyny. So relying on a sector that may not care much about women’s rights or their well-being to do the right thing is not going to work. Introducing the VAWG code in the Bill will help to make tech companies adequately investigate and respond to reports of abuse and take a proactive approach to minimise and prevent the risk of abuse taking place in the first instance.
Online Safety Bill Debate
Baroness Burt of Solihull (Liberal Democrat - Life peer)
(1 year, 4 months ago)
Lords Chamber
My Lords, I will speak to Amendment 5B in my name and that of my noble friend Lord Clement-Jones. I am reminded that this is a new stage of the Bill, so I should declare my interests. I have no current financial interests in the tech sector, but until 2019 I worked for one of the large technology companies that will be regulated, doing the kind of censorship job that the noble Lord, Lord Moylan, is concerned about. We clearly did not do it very well or we would not be here today replacing people like me with Ofcom.
Amendment 5B concerns an issue that we raised in Committee: the offence of encouragement of self-harm. That new offence was broadly welcomed, including on these Benches. We believe that there is scope, in some circumstances, to seek criminal prosecution of individuals who, online or otherwise, maliciously seek to encourage other people to harm themselves. The concern we raised in Committee, which we come back to today, is that we want the offence to be used in a way that we would all agree is sensible. We do not want people who are trying to help individuals at risk of self-harm to become concerned about and afraid of it, and to feel that they need to limit activities that would otherwise be positive and helpful.
In Committee we suggested that one way to do this would be to have a filter whereby the Director of Public Prosecutions looked at potential prosecutions under the new offence. With this amendment we take a different approach, which would in some senses be more effective: explicitly listing in the Bill the three categories of activity that would not render an individual liable to prosecution.
The first is people who provide an educational resource. We should be clear that some educational resources that are intended to help people recognise self-harm and turn away from it can contain quite explicit material. Those people are concerned that they might, in publishing that material with good intent, accidentally fall foul of the offence.
The second category is those who provide support—individuals providing peer support networks, such as an online forum where people discuss their experience of self-harm and seek to turn away from it. They should not be inadvertently caught up in the offence.
The third category is people posting information about their own experience of self-harm. Again, that could be people sharing quite graphic material about what they have been doing to themselves. I hope that there would be general agreement that we would not interpret, for example, a distressed teenager sharing material about their own self-harm, with the intent of seeking advice and support from others, as in some way encouraging or assisting others to commit self-harm themselves.
There is a genuine effort here to try to find a way through so that we can provide assurances to others. If the Minister cannot accept the amendment as it is, I hope he will reaffirm that the categories of people that I described are not the target of the offence and that he will be able to offer some kind of assurance as to how they can feel confident that they would not fall foul of prosecution.
Additionally, some of these groups feel with some conviction that their voices have not been as prominent in the debate as those of other organisations. The work they do is quite sensitive, and they are often quite small organisations. Between Report and the Bill becoming law, I hope that those who will be responsible for the detailed work on guidance around prosecutions—again, specificity is all—will meet the people on the front line, those running these fora and engaging with the young people who seek help around self-harm, and look in detail at what they are doing. That would be extraordinarily helpful.
Those are my two asks. Ideally, the Government would accept the amendment that we have tabled, but if not I hope that they can give the assurance that the three groups I listed are not the target and that they will commit to having relevant officials meet with individuals working on the front line, so that we can make sure that we do not end up prosecuting individuals without intending to.
My Lords, I support all the amendments in this group. However, what I have to say on my own amendments will take up enough time without straying on to the territory of others. I ask noble colleagues to please accept my support as read. I thank the Minister for meeting me and giving context and explanation regarding all the amendments standing in my name. I also welcome the government amendments on intimate image abuse in another group and on digitally altered images, which impinge directly on the cyberflashing amendments.
It is clear that the Government’s heart is in the right place, even if their acceptance of a consent-based law is not. I also thank the Law Commission for meeting me and explaining the thinking behind and practicalities of how the new law in relation to cyberflashing will work, and how the existing court system can help, such as juries deciding whether or not they believe the defendant. Last but definitely not least, I acknowledge the help that I have received from Professor Clare McGlynn, and Morgane Taylor from Bumble—both immensely knowledgeable and practical people who have inspired, informed and helped throughout.
I start with Amendments 5C and 7A in my name and that of the noble Baroness, Lady Finlay. I understand that the Government are following the advice of the Law Commission in refusing to accept a consent-based offence, but I point out gently that following that advice is something that the Government sometimes choose, and sometimes choose not, to do. Although the Law Commission consulted widely, that consultation did not show support for its proposals from victims and victims’ organisations. I am still of the view that a consent-based requirement would have prevented many unsolicited images being received by women and girls. I still worry that young girls may be socialised and sexualised by their peers who say that they are sending these images for a laugh. These girls do not have the maturity to say that they do not find it funny, but pretend it is okay while cringing with humiliation inside. Consent-based legislation would afford them the best protection and educate young girls and men that not only are women and girls frequently not interested in seeing a picture of a man’s willy, but girls think differently from boys about this. Who knew?
I also believe that a consent-based law would provide the most suitable foundation for education and prevention initiatives. However, I have listened to the Minister and the Law Commission. I have been told that, if it got to court, the complainant would not be humiliated all over again by having to give evidence in court and admit the distress and humiliation they felt. According to the Minister, as with the new intimate image amendment tabled by the Government themselves, it is up to the Crown Prosecution Service to follow it up and, after making their statement of complaint, my understanding is that the complainant does not have to take part further—more of that later. However, given the current rate of only 4% of alleged perpetrators even being charged in intimate image abuse cases, I worry that not only will victims continue to be reluctant to come forward but the chances of prosecution will be so slim that the offence will not act as a deterrent. We know from the experience of the offence of sharing sexual images without consent that motivation thresholds have limited police investigations and prosecutions because of the evidential challenges. That is why the Law Commission has recommended the introduction of a consent-based image offence there.
I am very happy to make that commitment. It would be useful to have their continued engagement, as we have had throughout the drafting of the Bill.
The noble Baroness, Lady Burt of Solihull, has tabled a number of amendments related to the new offence of cyberflashing. I will start with her Amendment 6. We believe that this amendment reduces the threshold of the new offence to too great an extent. It could, for example, criminalise a person sending a picture of naked performance art to a group of people, where one person might be alarmed by the image but the sender sends it anyway because he or she believes that it would be well received. That may be incorrect, unwise and insensitive, but we do not think it should carry the risk of being convicted of a serious sexual offence.
Crucially, the noble Baroness’s amendment requires that the harm against the victim be proven in court. Not only does this add an extra step for the prosecution to prove in order for the perpetrator to be convicted, it creates an undue burden on the victim, who would be cross-examined about his or her—usually her—experience of harm. For example, she might have to explain why she felt humiliated; this in itself could be retraumatising and humiliating for the victim. By contrast, Clause 170 as drafted means that the prosecution has only to prove and focus on the perpetrator’s intent.
I am very grateful for the Minister’s comments. This is the crux of my confusion: I am not entirely sure why it is necessary for the victim to appear in court. In intimate image abuse, is it not the case that the victim does not have to make an appearance in court? What is the difference between intimate image abuse and cyberflashing abuse? I do not get why one attracts a physical court appearance and the other does not. They seem to be different sides of the same coin to me.
If a defendant said that he—usually he—had sent an image believing that the consent of the recipient was implied, the person making the complaint would be cross-examined on whether or not she had indeed given that consent. If an offence predicated on proof of non-consent or proof of harm were made out, the victim could be called to give evidence and be cross-examined in court. The defence would be likely to lead evidence challenging the victim’s characteristics and credibility. We do not want that to be a concern for victims; we do not want that to be a barrier to victims coming forward and reporting abuse for fear of having their sexual history or intentions cross-examined.
On the various protections already within that original amendment, if it went to court, why would the person who had sent the image get prosecuted if he or she had a good reason for having sent it?
Online Safety Bill Debate
Baroness Burt of Solihull (Liberal Democrat - Life peer)
(1 year, 4 months ago)
Lords Chamber
My Lords, I rise briefly to support the noble Baroness, Lady Morgan, to welcome the government amendment and to say that this is a moment of delight for many girls—of all varieties. I echo the noble Baroness, Lady Fox, on the issue of having a broad consultation, which is a good idea. While our focus during the passage of this Bill has necessarily been on preventing harm, I hope this guidance will be part of the rather more aspirational and exciting side of the digital world that allows young people to participate in social and civic life in ways that do not tolerate abuse and harm on the basis of their gender. In Committee, I said that we have a duty not to allow digital tech to be regressive for girls. I hope that this is a first step.
My Lords, on behalf of my party, all the groups mentioned by the noble Baroness, Lady Morgan, and potentially millions of women and girls in this country, I briefly express my appreciation for this government amendment. In Committee, many of us argued that a gender-neutral Bill would not achieve strong enough protection for women and girls as it would fail to recognise the gendered nature of online abuse. The Minister listened, as he has on many occasions during the passage of the Bill. We still have differences on some issues—cyberflashing, for instance—but in this instance I am delighted that he is amending the Bill, and I welcome it.
Why will Ofcom be required to produce guidance and not a code, as in the amendment originally tabled by the noble Baroness, Lady Morgan? Is there a difference, or is it a case of a rose by any other name? Is there a timescale by which Ofcom should produce this guidance? Are there any plans to review Ofcom’s guidance once produced, just to see how well it is working?
We all want the same thing: for women and girls to be free to express themselves online and not to be harassed, abused and threatened as they are today.
My Lords, this very positive government amendment acknowledges that there is not equality when it comes to online abuse. We know that women are 27 times more likely than men to be harassed online, that two-thirds of women who report abuse to internet companies do not feel heard, and that three out of four women change their behaviour after receiving online abuse.
Like others, I am very glad to have added my name to support this amendment. I thank the Minister for bringing it before your Lordships’ House and for his introduction. It will place a requirement on Ofcom to produce and publish guidance for providers of Part 3 services in order to make online spaces safer for women and girls. As the noble Baroness, Lady Morgan, has said, while this is not a code of practice—and I will be interested in the distinction between the code of practice that was being called for and what we are expecting now—it would be helpful perhaps to know when we might expect to see it. As the noble Baroness, Lady Burt, just asked, what kind of timescale is applicable?
This is very much a significant step for women and girls, who deserve and seek specific protections because of the disproportionate amount of abuse received. It is crucial that the guidance take a holistic approach which focuses on prevention and tech accountability, and that it is as robust as possible. Can the Minister say whether he will be looking to the model of the Violence against Women and Girls Code of Practice, which has been jointly developed by a number of groups and individuals including Glitch, the NSPCC, 5Rights and Refuge? It is important that this be got right, that we see it as soon as possible and that all the benefits can be felt and seen.