My Lords, I, too, thank the Minister for the great improvements that the Government have made to the Secretary of State’s powers in the Bill during its passage through this House. I rise to speak briefly today to praise the Government’s new Amendments 1 and 2 to Clause 44. As a journalist, I was worried by the lack of transparency around these powers in the clause; I am glad that the lessons of Section 94 of the Telecommunications Act 1984, which had to be rescinded, have been learned. In a world of conspiracy theories that can be damaging to public trust and governmental and regulatory process, it has never been more important that Parliament and the public are informed about the actions of government when giving directions to Ofcom about the draft codes of practice. So I am glad that these new amendments resolve those concerns.
My Lords, I welcome Amendments 5 and 6, as well as the amendments that reflect the work done and comments made in earlier stages of this debate by the noble Baroness, Lady Kennedy. Of course, we are not quite there yet with this Bill, but we are well on the way as this is the Bill’s last formal stage in this Chamber before it goes back to the House of Commons.
Amendments 5 and 6 relate to the categorisation of platforms. I do not want to steal my noble friend’s thunder, but I echo the comments made about the engagement both from my noble friend the Minister and from the Secretary of State. I am delighted that the indications I have received are that they will accept the amendment to Schedule 11, which this House voted on just before the Recess; that is a significant and extremely welcome change.
When commentators outside talk about the work of a revising Chamber, I hope that this Bill will be used as a model for cross-party, non-partisan engagement in how we make a Bill as good as it possibly can be—particularly when it is as ground-breaking and novel as this one is. My noble friend the Minister said in a letter to all of us that this Bill had been strengthened in this Chamber, and I think that is absolutely right.
I also want to echo thanks to the Bill team, some of whom I was working with four years ago when we were talking about this Bill. They have stuck with the Bill through thick and thin. I also thank noble Lords across the House for their support for the amendments, as well as all those outside this House who have committed such time, effort, support and expertise to making sure this Bill is as good as possible. I wish it well with its final stages. I think we all look forward to both Royal Assent and the next big challenge, which is implementation.
My Lords, I thank the Minister for his introduction today and also for his letter, which set out the reasons for the very welcome amendments that he has tabled today. First, I must congratulate the noble Baroness, Lady Stowell, on her persistence in pushing amendments of this kind to Clause 45, which will considerably increase the transparency of the Secretary of State’s directions if they take place. They are extremely welcome as amendments to Clause 45.
Of course, there is always a “but”—by the way, I am delighted that the Minister took the advice of the House and clearly spent his summer reading through the Bill in great detail, or we would not have seen these amendments, I am sure—but I am just sorry that he did not take the opportunity also to address Clause 176, in terms of the threshold for powers to direct Ofcom in special circumstances, and of course the rather burdensome powers in relation to the Secretary of State’s guidance on Ofcom’s exercise of its functions under the Bill as a whole. No doubt we will see how that works out in practice and whether they are going to be used on a frequent basis.
My noble friend Lord Allan—and I must congratulate both him and the noble Lord, Lord Knight, on addressing this very important issue—has set out five assurances that he is seeking from the Minister. I very much hope that the Minister can give those today, if possible.
Congratulations are also due to the noble Baroness, Lady Kennedy, for finding a real loophole in the offence, which has now been amended. We are all delighted to see that the point has been well taken.
Finally, on the point raised by the noble Lord, Lord Rooker, clearly it is up to the Minister to respond to the points made by the committee. All of us would have preferred to see a comprehensive scheme in the primary legislation, but we are where we are. We wanted to see action on apps; they are to some extent circumscribed by the terms of the Bill. The terms of the Bill—as we have discussed—particularly with the taking out of “legal but harmful”, do not give a huge amount of leeway, so this is not perhaps as skeletal a provision as one might otherwise have thought. Those are my reflections on what the committee has said.
My Lords, I am grateful to hear what the Minister has just announced. The scheme that was originally prefigured in the pre-legislative scrutiny report has now got some chance of being delivered. I think the process and procedures are quite appropriate; it does need review and thought. There needs to be account taken of practice on the ground, how people have found the new system is working, and whether or not there are gaps that can be filled this way. I give my full support to the proposal, and I am very glad to see it.
Having got to the Dispatch Box early, I will just appeal to our small but very important group. We are on the last day on Report. We are reaching a number of issues where lots of debate has taken place in Committee. I think it would be quite a nice surprise for us all if we were to get through this quickly. The only way to do that is by restricting our contributions.
My Lords, I will speak briefly to Amendments 272AA and 274AA, only because at the previous stage of the Bill I tabled amendments related to the reporting of illegal content and fraudulent advertisements, covering reporting, complaints and transparency. I have not re-tabled them here, but I have had conversations with my noble friend the Minister. It is still unclear to those in the House and outside why the provisions relating to that type of reporting would not apply to fraudulent advertisements, particularly given that the more information that can be filed about those types of scams and fraudulent advertisements, the easier it would be for the platforms to gather information and help users and others to start to crack down on them. I wonder if, when he sums up, my noble friend could say something about the reporting provisions relating to fraudulent advertisements generally, and in particular around general reporting and reporting relating to complaints by users.
My Lords, as I was eagerly anticipating, government Amendments 238A and 238D seek to change the parliamentary process for laying the first regulations specifying the category 1 threshold conditions from the negative to the affirmative procedure. I am pleased to bring forward this change in response to the recommendation of your Lordships’ Delegated Powers and Regulatory Reform Committee.
The change will ensure that there are adequate levels of parliamentary scrutiny of the first regulations specifying the category 1 threshold conditions. This is appropriate given that the categorisation of category 1 services will lead to the most substantial duties on the largest and most influential services. As noble Lords are aware, these include the duties on user empowerment, user identity verification, journalistic and news publisher content, content of democratic importance, and fraudulent advertising.
Category 2A services will have only additional transparency and fraudulent advertising duties, and category 2B services will be subject only to additional transparency reporting duties. The burden of these duties is significantly less than the additional category 1 duties, and we have therefore retained the use of the negative resolution procedure for these regulations, as they require less parliamentary scrutiny.
Future changes to the category 1 threshold conditions will also use the negative procedure. This will ensure that the regime remains agile in responding to change, which I know was of particular concern to noble Lords when we debated the categorisation group in Committee. Keeping the negative procedure for such subsequent uses will avoid the risk of future changes being subject to delays because of parliamentary scheduling. I beg to move.
My Lords, I shall speak to Amendment 245. I would like to thank my noble friend the Minister, and also the Minister on leave, for the conversations that I have had with them about this amendment and related issues. As we have already heard, the platform categorisation is extremely important. So far, much of it is unknown, including which sites are actually going to be in which categories. For example, we have not yet seen any proposed secondary regulations. As my noble friend has just outlined, special duties apply, especially for those sites within category 1—user empowerment in particular, but also other duties relating to content and fraudulent advertisements.
Clause 85 and Schedule 11 set out the thresholds for determining which sites will be in category 1, category 2A or category 2B. I am very mindful of the exhortation of the noble Lord, Lord Stevenson, about being brief, but it is amazing how much you have to say about one word to explain this amendment. This amendment proposes to change an “and” to an “or” in relation to determining which sites would fall within category 1. It would move from a test of size “and” functionality to a test of size “or” functionality. This would give Ofcom more flexibility to decide which platforms really need category 1 designation. Category 1 should not be decided just on size; it should also be possible to determine it on the basis of functionality.
Functionality is defined in the Bill in Clause 208. We will get to those amendments shortly, but there is no doubt from what the Government have already conceded, or agreed with those of us who have been campaigning passionately on the Bill for a number of years, that functionality can make a platform harmful. It is perfectly possible to have small platforms that both carry highly harmful content and themselves become harmful in the way that they are designed. We have heard many examples and I will not detain the House with them, but I draw attention to two particular sites which capture how broad this is. The perpetrators of offline hate crimes are often linked to these small platforms. For example, the perpetrator of the 2018 Tree of Life synagogue mass shooting had an online presence on the right-wing extremist social network Gab. In the UK, Jake Davison, the self-proclaimed incel who killed five people in Plymouth in 2021, frequented smaller incel forums after he was banned from Reddit in the days leading up to the mass shooting.
I also want to share with noble Lords an email that I received just this week from a family who had been to see their Member of Parliament, Matt Rodda MP, and also the noble Baroness, Lady Kidron, who I know is very regretful that she cannot be here today. I thank Victoria and Jean Eustace for sharing the story of their sister and daughter. Victoria wrote: “I am writing to you regarding the Online Safety Bill, as my family and I are concerned it will not sufficiently protect vulnerable adults from harm. My sister, Zoe Lyalle, killed herself on 26 May 2020, having been pointed towards a method using an online forum called Sanctioned Suicide. Zoe was 18 years old at the time of her death and as such technically an adult, but she was autistic, so she was emotionally less mature than many 18 year-olds. She found it difficult to critically analyse written content”. She says that “The forum in question is not large and states on its face that it does not encourage suicide, although its content does just that”. The next part I was even more shocked about: “Since Zoe’s death, we have accessed her email account. The forum continues to email Zoe, providing her with updates on content she may have missed while away from the site, as well as requesting donations. One recent email included a link to a thread on the forum containing tips on how best to use the precise method that Zoe had employed”.
In her note to me, the Minister on leave said that she wanted to catch some of the platforms we are talking about with outsized influence. In my reply, I said that those sites on which people are encouraged to take their own lives or become radicalised and therefore take the harms they are seeing online into the real world undoubtedly exercise influence and should be tackled.
It is also perfectly possible for us to have large but safe platforms. I know that my noble friend Lord Moylan may want to discuss this in relation to sites that he has talked about already on this Bill. The risk of the current drafting is a flight of users from these large platforms, newly categorised as category 1, to the small, non-category 1 platforms. What if a platform becomes extremely harmful very quickly? How will it be recategorised speedily but fairly and involving parliamentary oversight?
The Government have run a variety of arguments as to why the “and” in the Bill should not become an “or”. They say that it creates legal uncertainty. Every Bill creates legal uncertainty; that is why we have an army of extremely highly paid lawyers, not just in this country but around the world. They say that what we are talking about is broader than illegal content or content related to children’s safety, but they have already accepted an earlier amendment on safety by design and, in subsections (10) to (12) of Clause 12, that specific extra protections should be available for content related to
“suicide or an act of deliberate self-injury, or … an eating disorder or behaviours associated with an eating disorder”
or abusive content relating to race, religion, sex, sexual orientation, disability or gender reassignment and that:
“Content is within this subsection if it incites hatred against people”.
The Government have already breached some of their own limits on content that is not just illegal or relates to child safety duties. In fact, they have agreed that that content should have enhanced triple-shield protection.
The Government have also said that they want to avoid burdens on small but low-harm platforms. I agree with that, but with an “or” it would be perfectly possible for Ofcom to decide by looking at size or functionality and to exclude those smaller platforms that do not present the harm we all care about. The Minister may also offer me a review of categorisation; however, it is a review of the tiers of categorisation and not the sites within the categories, which I think many of us will have views on over the years.
I come to what we should do on this final day of Report. I am very thankful to those who have had many conversations on this, but there is a fundamental difference of opinion in this House on these matters. We will talk about functionality shortly and I am mindful of the pre-legislative scrutiny committee’s recommendation that this legislation should adopt
“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.
There should be other factors. Ofcom should have the ability to decide whether it takes one factor or another, and not have a series of all the thresholds to be passed, to give it the maximum flexibility. I will listen very carefully to what my noble friend the Minister and other noble Lords say, but at this moment I intend to test the opinion of the House on this amendment.
My Lords, I strongly support Amendment 245. The noble Baroness, Lady Morgan of Cotes, has explained the nub of the problem we are facing—that size and functionality are quite separate. You can have large sites that perform a major social function and are extremely useful across society. Counter to that, you can have a small site focused on being very harmful to a small group of people. The problem is that, without providing the flexibility to Ofcom to determine how the risk assessment should be conducted, the Bill would lock it into leaving these small, very harmful platforms able to pursue their potentially ever-increasingly harmful activities almost out of sight. It does nothing to make sure that their risk assessments are appropriate.
My Lords, I wish to test the opinion of the House and I beg to move.
My Lords, the hour is late and I will not detain the House for long. However, I hope that the fact that we are all still sitting here at the end of a long Report stage, because we care very much about the Bill and what we are trying to achieve, will be noted by my noble friend the Minister, his officials and others who are watching. I thank my noble friend Lady Harding for so ably introducing the amendments, which I absolutely support. I was, perhaps for the first time, going to agree with something the noble Baroness, Lady Fox, said a day or so ago: that one thing we and Ofcom need to do much better is to understand the transparency of the algorithms. It is not just algorithms—this is where my knowledge ends—but other design features that make these sites addictive and harmful, and which are outside content. The Bill will not be capable of addressing even the next five years, let alone beyond that, if we do not reflect the fact that, as my noble friend Lady Harding said, it has already been amended so that one way its objectives are to be achieved is by services being required to focus on safety by design.
I hope very much that my noble friend will take up the invitation, because everybody is tired and has been looking at this Bill for so many hours and months that we are probably all word-blind. We could all do with standing back and thinking, “With the amendments made, how does it all hang together so that ultimately, we keep those we want to keep safe as safe as we possibly can?” On that basis, I support these amendments and look forward to hearing further from the Government about how they hope to keep safe those we all wish to keep safe.
My Lords, I rise to support the amendment in the name of the noble Baroness, Lady Kidron. She has been such a forceful voice throughout the passage of this Bill, driven by her passion to protect children, and no more so than with the amendment in her name. That is why I feel compelled to speak up to support her. So far, we have all worked with the Government to see the safe passage of the Online Safety Bill, with strong protections for children. These amendments would be yet another excellent and unique opportunity to protect children. This is what we have been fighting for for years, and it is so uplifting that the Government have listened to us throughout the passage of this Bill—so why stop now? If the Government are saying that the Bill is being clear about harms, they should have no objection to making it explicit.
These amendments press for safety by design to be embedded in later clauses of the Bill and go hand in hand with the earlier amendment that the House so clearly supported. It is clear that the design of services and algorithms is responsible for orchestrating and manipulating the behaviour, feelings, emotions and thoughts of children who, because they are at a vulnerable stage in their development, are easily influenced. We have all witnessed the disastrous impact of the new technology which is fast encroaching upon us, and our children will not be spared from it. So it is imperative that Ofcom have the tools with which to consider and interrogate system design separately from content because, as has been said, it is not only content that is harmful: design is too. We therefore need to take a holistic approach and leave nowhere to hide for the tech companies when it comes to harms affecting our children.
As I have said before, these amendments would send a loud and clear message to the industry that it is responsible for the design of its products and has to think of the consequences for our children’s mental health and well-being when considering design. What better way to do that than for the Government to accept these amendments, in order to show that they are on the side of our children, not the global tech companies, when it comes to protecting them from harm? They need to put measures in place to ensure that the way a service is designed is subject to the online safety regime we have all fought for over the years and during the passage of this Bill.
If the Government do not accept the amendment, perhaps the issue of harmful design could be included in the welcome proposed review of pornography. It would be good to hear the Minister’s thoughts on this idea—but I am not giving him a let-off. I hope he will listen to the strength of feeling and that the Government will reconsider their position, support the amendment and complete the one main task they set out to complete with this Bill, which is to protect children from harm no matter where it rears its ugly head online.
My Lords, as we discussed in Committee, the Bill contains strong protection for women and girls and places duties on services to tackle and limit the kinds of offences and online abuse that we know disproportionately affect them. His Majesty’s Government are committed to ensuring that women and girls are protected online as well as offline. I am particularly grateful to my noble friend Lady Morgan of Cotes for the thoughtful and constructive way in which she has approached ensuring that the provisions in the Bill are as robust as possible.
It is with my noble friend’s support that I am therefore pleased to move government Amendment 152. This will create a new clause requiring Ofcom to produce guidance that summarises, in one clear place, measures that can be taken to tackle the abuse that women and girls disproportionately face online. This guidance will relate to regulated user-to-user and search services and will cover content regulated under the Bill’s framework. Crucially, it will summarise the measures in the Clause 36 codes for Part 3 duties, namely the illegal and child safety duties. It will also include a summary of platforms’ relevant Part 4 duties—for example, relevant terms of service and reporting provisions. This will provide a one-stop shop for providers.
Providers that adhere to the codes of practice will continue to be compliant with the duties. However, this guidance will ensure that it is easy and clear for platforms to implement holistic and effective protections for women and girls across their various duties. Any company that says it is serious about protecting women and girls online will, I am sure, refer to this guidance when implementing protections for its users.
Ofcom will have the flexibility to shape the guidance in a way it deems most effective in protecting women and girls online. However, as outlined in this amendment, we expect that it will include examples of best practice for assessing risks of harm to women and girls from content and activity, and how providers can reduce these risks and emphasise provisions in the codes of practice that are particularly relevant to the protection of women and girls.
To ensure that this guidance is effective and makes a difference, the amendment creates a requirement on Ofcom to consult the Domestic Abuse Commissioner and the Victims’ Commissioner, among other people or organisations it considers appropriate, when it creates this guidance. Much like the codes of practice, this will ensure that the views and voices of experts on the issue, and of women, girls and victims, are reflected. This amendment will also require Ofcom to publish this guidance.
I am grateful to all the organisations that have worked with us and with my noble friend Lady Morgan to get to this point. I hope your Lordships will accept the amendment. I beg to move.
My Lords, I will speak very briefly to this amendment; I know that the House is keen to get on to other business today. I very much welcome the amendment that the Government have tabled. My noble friend the Minister has always said that they want to keep women and girls safe online. As has been referred to elsewhere, the importance of making our digital streets safer cannot be overestimated.
As my noble friend said, women and girls experience a disproportionate level of abuse online. That is now recognised in this amendment, although this is only the start, not the end, of the matter. I thank my noble friend and the Secretary of State for their engagement on this issue. I thank the chief executive and the chair of Ofcom. I also thank the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester, who I know cannot be here today, and the noble Lord, Lord Knight, who signed the original amendment that we discussed in Committee.
My noble friend has already talked about the campaigners outside the Chamber who wanted there to be specific mention of women and girls in the Bill. I thank Refuge, the 100,000 people who signed the End Violence Against Women coalition’s petition, BT, Glitch, Carnegie UK, Professor Lorna Woods, the NSPCC and many others who made the case for this amendment.
As my noble friend said, this is Ofcom guidance. It is not necessarily a code of practice, but it is still very welcome because it is broader than just the specific offences that the Government have legislated on, which I also welcome. As he said, this puts all the things that companies, platforms and search engines should be doing to protect women and girls online in one specific place. My noble friend mentioned holistic protection, which is very important.
There is no offline/online distinction these days. Women and girls should feel safe everywhere. I also want to say, because I know that my noble friend has had a letter, that this is not about saying that men and boys should not be safe online; it is about recognising the disproportionate levels of abuse that women and girls suffer.
I welcome the fact that, in producing this guidance, Ofcom will have to consult with the Domestic Abuse Commissioner and the Victims’ Commissioner and more widely. I look forward, as I am sure do all the organisations I just mentioned, to working with Ofcom on the first set of guidance that it will produce. It gives me great pleasure to have signed the amendment and to support its introduction.
My Lords, I know that we do not have long and I do not want to be churlish. I am not that keen on this amendment, but I want to ask a question in relation to it.
I am concerned that there should be no conflation in the best practice guidance between the actual, practical problems of, for example, victims of domestic abuse being stalked online, which is a threat to their safety, or threatened with physical violence—I understand that—and abuse. Abuse is horrible to be on the receiving end of, but it is important for freedom of thought and freedom of speech that we maintain the distinction between words and action. It is important not to overreact or frighten young women by saying that being shouted at is the same as being physically abused.
My Lords, I also support the amendments from the noble Baroness, Lady Kidron. It is relatively easy to stand here and make the case for age verification for porn: it is such a black and white subject and it is disgusting pornography, so of course children should be protected from it. Making the case for the design of the attention economy is more subtle and complex—but it is incredibly important, because it is the attention economy that is driving our children to extreme behaviours.
I know this from my own personal life; I enjoy incredibly lovely online content about wild-water swimming, and I have been taken down a death spiral towards ice swimming and have become a compulsive swimmer in extreme temperatures, partly because of the addiction generated by online algorithms. This is a lovely and heart-warming anecdote to give noble Lords a sense of the impact of algorithms on my own imagination, but my children are prone to much more dangerous experiences. The plasticity of their brains is so much more subtle and malleable; they are, like other children, open to all sorts of addiction, depression, sleeplessness and danger from predators. That is the economy that we are looking at.
I point noble Lords to the intervention from the surgeon general in America, Admiral Vivek Murthy—an incredibly impressive individual whom I came across during the pandemic. His 25-page report on the impact of social media on the young of America is incredibly eye-opening reading. Some 95% of American children have come across social media, and one-third of them see it almost constantly, he says. He attributes to the impact of social media depression, anxiety, compulsive behaviours and sleeplessness, as well as what he calls the severe impact on the neurological development of a generation. He calls for a complete bar on all social media for the under-13s and says that his own children will never get anywhere near a mobile phone until they are 16. That is the state of the attention economy that the noble Baroness, Lady Kidron, talks about, and that is the state of the design of our online applications. It is not the content itself but the way in which it is presented to our children, and it traps their imagination in the kind of destructive content that can lead them into all kinds of harms.
Admiral Murthy calls on legislators to act today—and that was followed on the same day by a commitment from the White House to look into this and table legislation to address the kind of design features that the noble Baroness, Lady Kidron, is looking at. I think that we should listen to the surgeon general in America and step up to the challenge that he has given to American legislators. I am enormously grateful to my noble friend the Minister for the incredible amount of work that he has already done to try to bridge the gap in this matter, but there is a way to go. Like my noble friend Lady Harding, I hope very much indeed that he will be able to tell us that he has been able to find a way across the gap, or else I shall be supporting the noble Baroness, Lady Kidron, in her amendment.
I rise briefly to speak to this group of amendments. I want to pick up where my noble friend Lord Bethell has just finished. The Government have listened hugely on this Bill and, by and large, the Bill, and the way in which Ministers have engaged, is a model of how the public wants to see their Parliament acting: collaboratively and collegiately, listening to each other and with a clear sense of purpose that almost all of us want to see the Bill on the statute book as soon as possible. So I urge my noble friend the Minister to do so again. I know that there have been many conversations and I think that many of us will be listening with great care to what he is about to say.
There are two other points that I wanted to mention. The first is that safety by design was always going to be a critical feature of the Bill. I have been reminding myself of the discussions that I had as Culture Secretary. Surely, in general, we want to prevent our young people in particular from encountering harms in the first place, rather than always having to think about the moderation of harmful content once it has been posted.
Secondly, I would be interested to hear what the Minister has to say about why the Government find it so difficult to accept these amendments. Has there been some pushback from those who are going to be regulated? That would suggest that, while they can cope with the regulation of content, there is still secrecy surrounding the algorithms, functionalities and behaviours. I speak as the parent of a teenager who, if he could, would sit there quite happily looking at YouTube. In fact, he may well be doing that now—he certainly will not be watching his mother speaking in this House. He may well be sitting there and looking at YouTube and the content that is served up automatically, time after time.
I wonder whether this is, as other noble Lords have said, an opportunity. If we are to do the Bill properly and to regulate the platforms—and we have decided we need to do that—we should do the job properly and not limit ourselves to content. I shall listen very carefully to what my noble friend says but, with regret, if there is a Division, I will have to support the indomitable noble Baroness, Lady Kidron, as I think she was called.
My Lords, I very strongly support the noble Baroness, Lady Kidron, in her Amendments 35, 36 and 281F and in spirit very much support what the noble Lord, Lord Russell, said in respect of his amendments. We have heard some very powerful speeches from the noble Baroness, Lady Kidron, herself, from the noble Baronesses, Lady Harding and Lady Morgan, from the right reverend Prelate the Bishop of Oxford, from my noble friend Lady Benjamin and from the noble Lords, Lord Russell and Lord Bethell. There is little that I can add to the colour and the passion that they brought to the debate today.
As the noble Baroness, Lady Kidron, said at the outset, it is not just about content; it is about functionalities, features and behaviours. It is all about platform design. I think the Government had pretty fair warning throughout the progress of the Bill that we would be keen to probe this. If the Minister looks back to the Joint Committee report, he will see that there was a whole chapter titled “Societal harm and the role of platform design”. I do not think we could have been clearer about what we wanted from this legislation. One paragraph says:
“We heard throughout our inquiry that there are design features specific to online services that create and exacerbate risks of harm. Those risks are always present, regardless of the content involved, but only materialise when the content concerned is harmful”.
It goes on to give various examples and says:
“Tackling these design risks is more effective than just trying to take down individual pieces of content (though that is necessary in the worst cases). Online services should be identifying these design risks and putting in place systems and process to mitigate them before people are harmed”.
That is the kind of test that the committee put. It is still valid today. As the noble Baroness said, platforms are benefiting from the network effect, and the Threads platform is an absolutely clear example of how that is possible.
The noble Lord, Lord Russell, gave us a very chilling example of the way that infinite scrolling worked for Milly. A noble Lord on the Opposition Bench, a former Home Secretary whose name I momentarily forget, talked about the lack of empathy of AI in these circumstances. The algorithms can be quite relentless in pushing this content; they lack human qualities. It may sound over the top to say that, but that is exactly what we are trying to legislate for. As the noble Lord, Lord Russell, says, just because we cannot always anticipate what the future holds, there is no reason why we should not try. We are trying to future-proof ourselves as far as possible, and it is not just the future but the present that we are trying to proof against through these amendments. We know that AI and the metaverse are coming down the track, but there are present harms that we are trying to legislate for as well. The noble Baroness, Lady Kidron, was absolutely right to keep reminding us about Molly Russell. It is this kind of algorithmic amplification that is so dangerous to our young people.
The Minister has a chance, still, to accede to these amendments. He has heard the opinion all around the House. It is rather difficult to understand what the Government’s motives are. The noble Baroness, Lady Morgan, put her finger on it: why is it so difficult to accede to these? We have congratulated the Government, the Minister and the Secretary of State throughout these groups over the last day and a bit; they have been extremely consensual and have worked very hard at trying to get agreement on a huge range of issues. Most noble Lords have never seen so many government amendments in their life. So far, so good; why ruin it?
My Lords, first, I welcome the amendment from the noble Lord, Lord Allan, and his motivation, because I am concerned that, throughout the Bill, the wrong targets are being caught up. I was grateful to hear his recognition that people who talk about their problems with self-harm could end up being targeted, which nobody would ever intend. These things need to be taken seriously.
In that sense, I was slightly concerned about the motivation of the noble Baroness, Lady Burt of Solihull, in the “reckless” amendment. The argument was that the recklessness standard is easier to prove. I am always worried about things that make it easier to prosecute someone, rather than there being a just reason for that prosecution. As we know, those involved in sending these images are often immature and very foolish young men. I am concerned about lowering the threshold at which we criminalise them—potentially destroying their lives, by the way, because if you have a criminal record it is not good—even though I in no way tolerate what they are doing and it is obviously important that we take that on.
There is a danger that this law will become a mechanism through which people try to resolve a whole range of social problems—which brings me on to responding to the speech just made by the noble Baroness, Lady Kennedy of The Shaws. I continue to be concerned about the question of trying to criminalise indirect threats. The point about somebody who sends a direct threat is that we can at least see the connection between that direct threat and the possibility of action. It is the same sort of thing that we have historically considered in relation to incitement. I understand that, where your physical being is threatened by words, physically a practical thing can happen, and that is to be taken very seriously. The problem I have is with the indirect threat from somebody who says, for example, “That smile should be taken of your face. It can be arranged”, or other indirect but incredibly unpleasant comments. There is clearly no link between that and a specific action. It might use violent language but it is indirect: “It could be arranged”, or “I wish it would happen”.
Anyone on social media—I am sure your Lordships all are—will know that I follow very carefully what people from different political parties say about each other. I do not know if you have ever followed the kind of things that are said about the Government and their Ministers, but the threats are not indirect and are often named. In that instance, it is nothing to do with women, but it is pretty violent and vile. By the way, I have also followed what is said about the Opposition Benches, and that can be pretty violent and vile, including language that implies that they wish those people were the subject of quite intense violence—without going into detail. That happens, and I do not approve of it—obviously. I also do not think that pile-ons are pleasant to be on the receiving end of, and I understand how they happen. However, if we criminalise pile-ons on social media, we are openly imposing censorship.
What is worse in my mind is that we are allowing the conflation of words and actions, where what people say or think is the same as acting on it, as the criminal law would see it. We have seen a very dangerous trend recently, which is particularly popular in the endless arguments and disputes over identity politics, where people will say that speech is violence. This has happened to a number of gender-critical feminists, in this instance women, who have gone in good faith to speak at universities, having been invited. They have been told that their speech was indistinguishable from violence and that it made students at the university feel under threat and unsafe and that it was the equivalent of being attacked. But guess what? Once you remove that distinction, the response to that speech can be to use violence, because you cannot tell the difference between them. That has happened around a number of university actions, where speakers and their supporters were physically assaulted by people who said that they were using self-defence against speech that was violent. I get nervous that this is a slippery slope, and we certainly should not go anywhere near it in legislation.
Finally, I agree that we should tackle the culture of people piling on and using this kind of language, but it is a cultural and social question. What we require is moral leadership and courage in the face of it—calling it out, arguing against it and so on. It is wrong to use the law to send messages; it is an abdication of moral leadership and a cop-out, let alone dangerous in what is criminalised. I urge your Lordships to reject those amendments.
My Lords, I will speak briefly to Amendments 5C and 7A in this group. I welcome the Government’s moves to criminalise cyberflashing. It is something that many have campaigned for in both Houses and outside for many years. I will not repeat the issues so nobly introduced by the noble Baroness, Lady Burt, and I say yet again that I suspect that the noble Baroness, Lady Featherstone, is watching, frustrated that she is still not able to take part in these proceedings.
It is worth making the point that, if actions are deemed to be serious enough to require criminalisation and for people potentially to be prosecuted for them, I very much hope that my noble friend the Minister will be able to say in his remarks that this whole area of the law will be kept under review. There is no doubt that women and girls’ faith in the criminal justice system, both law enforcement and the Crown Prosecution Service, is already very low. If we trumpet the fact that this offence has been introduced, and then there are no prosecutions because the hurdles have not been reached, that is even worse than not introducing the offence in the first place. So I hope very much that this will be kept under review, and no doubt there will be opportunities to return to it in the future.
I do not want to get into the broader debate that we have just heard, because we could be here for a very long time, but I would just say to the noble Baronesses, Lady Kennedy and Lady Fox, that we will debate this in future days on Report and there will be specific protection and mention of women and girls on the face of the Bill—assuming, of course, that Amendment 152 is approved by this House. The guidance might not use the words that have been talked about, but the point is that that is the place to have the debate—led by the regulator with appropriate public consultation—about the gendered nature of abuse that the noble Baroness, Lady Kennedy, has so eloquently set out. I hope that will also be a big step forward in these matters.
I look forward to hearing from the Minister about how this area of law will be kept under review.
My Lords, I understand that, as this is a new stage of the Bill, I have to declare my interests: I am the chair of 5Rights Foundation, a charity that works around technology and children; I am a fellow at the computer science department at Oxford University; I run the Digital Futures Commission, in conjunction with the 5Rights Foundation and the London School of Economics; I am a commissioner on the Broadband Commission; I am an adviser for the AI ethics institute; and I am involved in Born in Bradford and the Lancet commission, and I work with a broad number of civil society organisations.
I also welcome these amendments and want to pay tribute to Maria Miller in the other place for her work on this issue. It has been extraordinary. I too was going to raise the issue of the definition of “photograph”, so perhaps the Minister could say or, even better, put it in the Bill. It does extend to those other contexts.
My main point is about children. We do not want to criminalise children, but this is pervasive among under-18s. I do want to make the distinction between those under-18s who intentionally harm another under-18 and have to be responsible for what they have done in the meaning of the law as the Minister set it out, and those who are under the incredible pressure—I do not mean coercion, because that is another out-clause—of oversharing that is inherent in the design of many of these services. That is an issue I am sure we are going to come back to later today. I would love to hear the Minister say something about the Government’s intention from the Dispatch Box: that it is preventive first and there is a balance between education and punishment for under-18s who find themselves unavoidably in this situation.
Very briefly, before I speak to these amendments, I want to welcome them. Having spoken on, and introduced, provisions on threats to share intimate images under the Domestic Abuse Act 2021, I think it is really welcome that everything has been brought together in one place. Again, I pay tribute to the work of Dame Maria Miller and many others outside who have raised these issues. I also want to pay tribute to the Ministry of Justice Minister Edward Argar, who has also worked with my noble friend the Minister on this.
I have one specific question. The Minister did mention this in his remarks, but could he be absolutely clear that these amendments do not specifically mention the lifetime anonymity of complainants or the special measures, in relation to giving evidence, that apply to witnesses? That came up in the last group of amendments as well. Because these provisions are not actually in this drafting, it would be helpful if he could put on record their relationship with the provisions in the Sexual Offences Act 2003. I know that would be appreciated by campaigners.
My Lords, I have very little to add to the wise words that we have heard from my noble friend and from the noble Baronesses, Lady Kidron and Lady Morgan. We should thank all those who have got us to this place, including the Law Commission. It was a separate report. In that context, I would be very interested to hear a little more from the Minister about the programme of further offences that he mentioned. The communication offences that we have talked about so far are either the intimate images offences, which there was a separate report on, or other communications offences, which are also being dealt with as part of the Bill. I am not clear what other offences are in the programme.
Finally, the Minister himself raised the question of deepfakes. I have rustled through the amendments to see exactly how they are caught. The question asked by the noble Baroness, Lady Kidron, is more or less the same but put a different way. How are these deepfakes caught in the wording that is now being included in the Bill? This is becoming a big issue and we must be absolutely certain that it is captured.
Lords ChamberIs he not outrageous, trying to make appeals to one’s good humour and good sense? But I support him.
I will say only three things about this brief but very useful debate. First, I welcome the toggle-on, toggle-off resolution: that is a good move. It makes sure that people make a choice and that it is made at an appropriate time, when they are using the service. That seems to be the right way forward, so I am glad that that has come through.
Secondly, I still worry that terms of service, even though there are improved transparency measures in these amendments, will eventually need some form of power for Ofcom to set de minimis standards. So much depends on the ability of the terms of service to carry people’s engagement with the social media companies, including the decisions about what to see and not to see, and about whether they want to stay on or keep off. Without some power behind that, I do not think that the transparency will take it. However, we will leave it as it is; it is better than it was before.
Thirdly, user ID is another issue that will come back. I agree entirely with what the noble Lord, Lord Clement-Jones, said: this is at the heart of so much of what is wrong with what we see and perceive as happening on the internet. To reduce scams, to be more aware of trolls and to be aware of misinformation and disinformation, you need some sense of who you are talking to, or who is talking to you. There is a case for having that information verified, whether or not it is done on a limited basis, because we need to protect those who need to have their identities concealed for very good reason—we know all about that. As the noble Lord said, it is popular to think that you would be a safer person on the internet if you were able to identify who you were talking to. I look forward to hearing the Minister’s response.
My Lords, I will speak very briefly to Amendments 55 and 182. We are now at the stage of taking the lead entirely from the Minister and the noble Lords opposite—the noble Lords, Lord Stevenson and Lord Clement-Jones—in accepting these amendments, because we now need to see how this will work in practice. That is why we all think that we will be back here talking about these issues in the not too distant future.
My noble friend the Minister rightly said that, as we debated in Committee, the Government made a choice in taking out “legal but harmful”. Many of us disagree with that, but that is the choice that has been made. So I welcome the changes that have been made by the Government in these amendments to at least allow there to be more empowerment of users, particularly in relation to the most harmful content and, as we debated, in relation to adult users who are more vulnerable.
It is worth reminding the House that we heard very powerful testimony during the previous stage from noble Lords with personal experience of family members who struggle with eating disorders, and how difficult these people would find it to self-regulate the content they were looking at.
In Committee, I proposed an amendment about “toggle on”. Anyone listening to this debate outside who does not know what we are talking about will think we have gone mad, talking about toggle on and toggle off, but I proposed an amendment for toggle on by default. Again, I take the Government’s point, and I know my noble friend has put a lot of work into this, with Ministers and others, in trying to come up with a sensible compromise.
I draw attention to Amendment 55. I wonder if my noble friend the Minister is able say anything about whether users will be able to have specific empowerment in relation to specific types of content, where they are perhaps more vulnerable if they see it. For example, the needs of a user might be quite different between those relating to self-harm and those relating to eating disorder content or other types of content that we would deem harmful.
On Amendment 182, my noble friend leapt immediately to abusive content coming from unverified users but, as we have heard, and as I know, having led the House’s inquiry into fraud and digital fraud last year, there will be, and already is, a prevalence of scams. The Bill is cracking down on fraudulent advertisements but, as an anti-fraud measure, being able to see whether an account has been verified would be extremely useful. The view now is that, if this Bill is successful—and we hope it is—in cracking down on fraudulent advertising, then fraudsters will rely even more on what is called organic reach, which includes the use of fake accounts, so verification becomes more important. We have heard from opinion polling that the public want to see which accounts are or are not verified. We have also heard that Amendment 182 is about giving users choice, in making clear whether their accounts are verified; it is not about compelling people to say whether they are verified or not.
As we have heard, this is a direction of travel. I understand that the Government will not want to accept these amendments at this stage, but it is useful to have this debate to see where we are going and what Ofcom will be looking at in relation to these matters. I look forward to hearing what my noble friend the Minister has to say about these amendments.
My Lords, I speak to Amendment 53, on the assessment duties, and Amendment 60, on requiring services to provide a choice screen. It is the first time we have seen these developments. We are in something of a see-saw process over legal but harmful. I agree with my noble friend Lord Clement-Jones when he says he regrets that it is no longer in the Bill, although that may not be a consistent view everywhere. We have been see-sawing backwards and forwards, and now, like the Schrödinger’s cat of legal but harmful, it is both dead and alive at the same time. Amendments that we are dealing with today make it a little more alive that it was previously.
In this latest incarnation, we will insist that category 1 services carry out an assessment of how they will comply with their user-empowerment responsibility. Certainly, this part seems reasonable to me, given that it is limited to category 1 providers, which we assume will have significant resources. Crucially, that will depend on the categorisations—so we are back to our previous debate. If we imagine category 1 being the Meta services and Twitter, et cetera, that is one thing, but if we are going to move others into category 1 who would really struggle to do a user empowerment tool assessment—I have to use the right words; it is not a risk assessment—then it is a different debate. Assuming that we are sticking to those major services, asking them to do an assessment seems reasonable. From working on the inside, I know that even if it were not formalised in the Bill, they would end up having to do it as part of their compliance responsibilities. As part of the Clause 8 illegal content risk assessment, they would inevitably end up doing that.
That is because the categories of content that we are talking about in Clauses 12(10) to (12) are all types of content that might sometimes be illegal and sometimes not illegal. Therefore, if you were doing an illegal content risk assessment, you would have to look at it, and you would end up looking at types of content and putting them into three buckets. The first bucket is that it is likely illegal in the UK, and we know what we have to do there under the terms of the Bill. The second is that it is likely to be against your terms of service, in which case you would deal with it there. The third is that it is neither against your terms of service nor against UK law, and you would make a choice about that.
I want to focus on what happens once you have done the risk assessment and you have to have the choice screen. I particularly want to focus on services where all the content in Clause 12 is already against their terms of service, so there is no gap. The whole point of this discussion about legal but harmful is imagining that there is going to be a mixed economy of services and, in that mixed economy, there will be different standards. Some will wish to allow the content listed in Clause 12—self-harm-type content, eating disorder content and various forms of sub-criminal hate speech. Some will choose to do that—that is going to be their choice—and they will have to provide the user empowerment tools and options. I believe that many category 1 providers will not want to; they will just want to prohibit all that stuff under their terms of service and, in that case, offering a choice is meaningless. That will not make the noble Lord, Lord Moylan, or the noble Baroness, Lady Fox, very happy, but that is the reality.
Most services will just say that they do not want that stuff on their platform. In those cases, I hope that what we are going to say is that, in their terms of service, when a user joins a service, they can say that they have banned all that stuff anyway, so they are not going to give the user a user empowerment tool and, if the user sees that stuff, they should just report it and it will be taken down under the terms of service. Throughout this debate I have said, “No more cookie banners, please”. I hope that we are not going to require people, in order for them to comply with this law, to offer a screen that people then click through. It is completely meaningless and ineffective. For those services that have chosen under their terms of service to restrict all the content in Clause 12, I hope that we will be saying that their version of the user empowerment tool is not to make people click anything but to provide education and information and tell them where they can report the content and have it taken down.
Then there are those services that will choose to protect that content and allow it on their platform. I agree with the noble Lord, Lord Moylan, that this is, in some sense, Twitter-focused or Twitter-driven legislation, because Twitter tends to be more in the freedom of speech camp and to allow hate speech and some of that stuff. It will be more permissive than Facebook or Instagram in its terms; it may choose to retain that content, and it will then have to offer that screen. That is fine, but we should not be making services do so when they have already prohibited such content.
The noble Lord, Lord Moylan, mentioned services that use community moderators to moderate part of the service and asked how this would apply there. Reddit is the obvious example, but there are others. Reddit is more at the freedom of expression end of things, and if some subreddits, which are spaces within Reddit, allow hate speech or the kind of speech covered by Clause 12, it would be rational to say that user empowerment in the context of Reddit means being told that you can join these subreddits and be fine, or join those subreddits and allow yourself to be exposed to this kind of content. What would not make sense would be for Reddit to do it individual content item by content item. When we are thinking about this, I hope that the implementation would say that, for a service with community-moderated spaces and subspaces within the larger community, user empowerment means choosing which subspaces you enter, with information provided about them. Reddit would say to the moderators of the subreddits, “You need to tell us whether you have any Clause 12-type content”—I shall keep using that language—“and, if you are allowing it, you need to make sure that access to it is restricted”. But we should not expect Reddit to restrict every individual content item.
Finally, as a general note of caution, noble Lords may have detected that I am not entirely convinced that these will be hugely beneficial tools, other than perhaps for a small subset of Twitter users. There is an issue around particular kinds of content on Twitter, and particular Twitter users, including people in prominent positions in public life, for whom these tools make sense. For a lot of other people, they will not be particularly meaningful. I hope that we are going to keep focused on outcomes and not waste effort on things that are not effective.
As I say, many companies, when they are faced with this, will look at it and say, “I have limited engineering time. I could build all these user empowerment tools, or I could just ban the Clause 12 stuff in my terms of service”. That would not be a great outcome for freedom of expression; it might be a good outcome for the people who wanted to prohibit legal but harmful content in the first place. Companies will make that as a hard-headed business decision. It is much more expensive to try to maintain these different regimes and flag all this content and so on; it is simpler to have one set of standards.
(1 year, 6 months ago)
Lords Chamber
My Lords, I am very pleased to support the noble Baroness, Lady Kidron, with these amendments. I also welcome the fact that we have, I hope, reached the final day of this stage of the Bill, which means that it is getting closer to becoming an Act of Parliament. The amendments to these clauses are a very good example of why the Bill needs to become an Act sooner rather than later.
As we heard during our earlier debates, social media platforms have for far too long avoided taking responsibility for the countless harms that children face on their services. We have, of course, heard about Molly Russell’s tragic death and learned from the coroner’s inquest report that it was on Instagram that Molly viewed some of the most disturbing posts. Despite this, at the inquest Meta’s head of health and well-being policy shied away from taking blame and claimed that the posts which the coroner said contributed to Molly’s death
“in a more than minimal way”
were, in Meta’s words, “safe”. Molly’s family and others have to go through the unthinkable when they lose their child in such a manner. Their lives can be made so much harder when they attempt to access their child’s social media accounts and activities only to be denied by the platforms.
The noble Baroness’s various amendments are not only sensible but absolutely the right thing to do. In many ways, it is a great tragedy that we have had to wait for this piece of primary legislation to compel these companies to act. I understand what the noble Lord, Lord Allan, very rationally said—companies should very much welcome these amendments—but it is a great shame that they have often not behaved better in these circumstances previously.
There is perhaps no point going into the details, because we want to hear from the Minister about what the Government will propose. I welcome the fact that the Government have engaged early-ish on these amendments and on these matters.
The amendments would force platforms to comply with coroners in investigations into the death of a child, to have a named senior manager in relation to inquests and to allow easier access to a child’s social media account for bereaved families. We will have to see how the Government’s amendments reflect that. One of the areas that the noble Baroness said had perhaps not been buttoned down is the responsibility for a named senior manager in relation to an inquest. The requirement is that:
“If Ofcom has issued a notice to a service provider they must name a senior manager responsible for providing material on behalf of the service and to inform that individual of the consequences for not complying”.
The noble Lord, Lord Allan, set out very clearly why having a named contact in these companies is important. Bereaved families find it difficult, if not impossible, to make contact with tech companies: they get lost in the automated systems and, if they are able to access a human being, they are told that the company cannot or will not give that information. We know that different coroners have had widely differing experiences getting information from the social media platforms, some refusing altogether and others obfuscating. Only a couple of companies have co-operated fully, and in only one or two instances. Creating a single point of contact, who understands the law—which, as we have just heard, is not necessarily always straightforward, particularly if it involves different jurisdictions—understands what is technically feasible and has the authority and powers afforded to the regulator will ensure a swifter, more equitable and less distressing process.
I have set this out at length because we will obviously hear what the Minister proposes; if it does not reflect having a named senior manager, then I hope very much that we are able to discuss that between this and the next stage.
Social media platforms have a responsibility to keep their users safe. When they fail, they should be obligated to co-operate with families and investigations, rather than seeking to evade them. Seeing what their child was viewing online before their death will not bring that child back, but it will help families on their journey towards understanding what their young person was going through, and towards seeking justice. Likewise, ensuring that platforms comply with inquests will help to ease the considerable strain on bereaved families. I urge noble Lords to support these amendments or to listen to what the Government say. Hopefully, we can come up with a combined effort to put an end to the agony that these families have been through.
My Lords, I strongly support this group of amendments in the name of the noble Baroness, Lady Kidron, and other noble Lords. I, too, acknowledge the campaign group Bereaved Families for Online Safety, which has worked so closely with the noble Baroness, Lady Kidron, 5Rights and the NSPCC to bring these essential changes forward.
Where a child has died, sadly, and social media is thought to have played a part, families and coroners have faced years of stonewalling, often never managing to access data or information relevant to that death; this adds greatly to their grief and delays the finding of some kind of closure. We must never again see a family treated as Molly Russell’s family was treated, when it took five years of campaigning to get partial sight of material that the coroner found so distressing that he concluded that it contributed to her death in a more than minimal way; nor can it be acceptable for a company to refuse to co-operate, as in the case of Frankie Thomas, where Wattpad failed to provide the material requested by the coroner on the grounds that it was not based within the UK’s jurisdiction. With the threat of a fine of only £1,000 to face, companies feel little need to comply. These amendments would mean that tech companies now had to comply with Ofcom’s information notices or face a fine of up to 10% of their global revenue.
Coroners’ powers must be strengthened by giving Ofcom the duty and power to require relevant information from companies in cases where there is reason to suspect that a regulated service provider may hold information relevant to a child’s death. Companies may not want to face up to the role they have played in the death of a child by their irresponsible recommending and pushing of violent, sexual, depressive and pro-suicide material through algorithmic design, but they need to be made to answer when requested by a coroner on behalf of a bereaved family.
Amendment 215 requires a named senior manager, a concept that I am thankful is already enshrined in the Bill, to receive and respond to an information notice from Ofcom to ensure that a child’s information, including their interactions and behaviour and the actions of the regulated service provider, is preserved and made available. This could make a profound difference to how families will be treated by these platforms in future. Too often in the past, they have been evasive and unco-operative, adding greatly to the inconsolable grief of such bereaved parents. As Molly Russell’s father Ian said:
“Having lived through Molly’s extended inquest, we think it is important that in future, after the death of a child, authorities’ access to data becomes … a matter of course”
and
“A more compassionate, efficient and speedy process”.
I was going to ask the Government to accept these amendments but, having listened to the noble Baroness, Lady Kidron, I am looking forward to their proposals. We must ensure that a more humane route for families and coroners to access data relating to the death of a child is at last available in law.
My Lords, I will speak briefly to Amendment 218JA, spoken to by the noble Lord, Lord Allan. My name is attached to it online but has not made it on to the printed version. He introduced it so ably and comprehensively that I will not say much more, but I will be more direct with my noble friend the Minister.
This amendment would remove Clause 133(11). The noble Lord, Lord Allan, mentioned that BT has raised with us—I am sure that others have too—that the subsection gives examples of access facilities, such as ISPs and application stores. However, as the noble Lord said, services could use other facilities, such as operating systems, browsers and VPNs, to evade these access restriction orders. While it is the convention for me to say that I would support this amendment should it be moved at a later stage, this is one of those issues that my noble friend the Minister could take off the table this afternoon—he has had letters about it to which there have not necessarily been replies—simply by saying that subsection (11) does not give the whole picture, that there are other services and that it is misleading to give just these examples. Will he clarify at the Dispatch Box and on the record, for the benefit of everyone using the Bill now and in future, what broader services are caught? We could then take the issue off the table on this 10th day of Committee.
My Lords, I will be even more direct than the noble Baroness, Lady Morgan, and seek some confirmation. I understood from our various briefings in Committee that, where content is illegal, it is illegal anywhere in the digital world—it is not restricted simply to user to user, search and Part 5. Can the Minister say whether I have understood that correctly? If I have, will he confirm that Ofcom will be able to use its disruption powers on a service out of scope, as it were, such as a blog or a game with no user-to-user aspect, if it were found to be persistently hosting illegal content?
(1 year, 6 months ago)
Lords Chamber
My Lords, I am very grateful to the noble Baronesses, Lady Parminter and Lady Deech, and the noble Lord, Lord Mann, for their support. After a miscellaneous selection of amendments, we now come back to a group of quite tight amendments. Given the hour, those scheduling the groupings should be very pleased, because for the first time we have done all the groups that we set out to do this afternoon. I do not want to tempt fate, but I think we will have a good debate before we head off for a little break from the Bill.
In the response I gave, I said that our assessment is that the riskiest content is illegal content and content which presents harm to children. That is the assessment and the approach taken in the Bill. In relation to other content, which is legal and which adults may choose how to encounter, there are protections in the Bill to enforce terms of service and to empower users to curate their own experience online, but that assessment is made by adult users within the law.
I thank all noble Lords who spoke in this short but important debate. As we heard, some issues relating to risk and harm have been returned to and will no doubt be again, and we note the impact of the absence of legal but harmful as a concept. As the noble Baroness, Lady Bull, said, I know that the noble Baroness, Lady Parminter, was very sad that she could not be here this afternoon due to another engagement.
I will not keep the House much longer. I particularly noted the noble Baroness’s point that there should not be, and is not, a direct relationship between the size of a platform and its ability to cause harm. There is a balance to be struck between the regulatory burden placed on platforms and the health and well-being of those who are using them. As I have said before, I am not sure that we have always got that particular balance right in the Bill.
The noble Lord, Lord Allan, was very constructive: it has to be a good thing that we are now beginning to think about the Bill’s implementation and how it would work in practice, although we have not quite reached the end and I do not want to prejudge any further stages. Of course, he is right to say that some of these platforms have no intention of complying with these rules at all; Ofcom and the Government will have to work out what to do about that.
Ultimately, the Government of the day—whoever it might be—will want the powers to be able to say that a small platform is deeply harmful in terms of its content and reach. When the Bill has been passed, there will be pressure at some point in the future over a platform that is broadcasting, distributing or amplifying content that is deeply harmful. Although I will withdraw the amendment today, my noble friend’s offer of further conversations, with more detail on categorisation and on any review of the platforms categorised as category 1, category 2 and beyond, would be very helpful in due course. I beg leave to withdraw.
(1 year, 6 months ago)
Lords Chamber
My Lords, I am grateful to noble Lords who have added their name to my Amendment 271, which arose out of concerns that there are now seemingly several offences that laudably aim to protect women but are not being enforced effectively. The most notable in this category is the low rate of rape cases that are prosecuted and lead to convictions. In theory, the amendment is not affected by how cyberflashing is defined, whether the offence takes the form recommended by the Law Commission, based on specific intent, or is instead based on consent. In practice, however, if the offence remains in the specific intent form, the victim will not be required to go to court, so the amendment would be more effective if the offence remained on that basis. Yet even if on that basis the victim does not need to go to court, someone who has been cyberflashed is, as other noble Lords have mentioned, unlikely to go to the police station to report what has happened.
This amendment is designed to put an obligation on the providers of technology to provide a reporting mechanism on phones and to collate that information before passing it to the prosecuting authorities. The Minister said that there are various issues with how the amendment is currently drafted, such as its reference to “the Crown Prosecution Service” rather than “the police”, and perhaps the definition of “providers of internet services”, as it may be a different part of the tech industry that is required to collate this information.
Drawing on our discussions on the previous group of amendments regarding the criminal law, I hope that my noble friend can clarify the issue of intent, that is, the mens rea, which is distinct from motive, in relation to this matter. The purpose of the amendment is to ensure that there will be resources and expertise from the technology sector to provide these reporting mechanisms for the offences. One can imagine how many more people would report cyberflashing if they only had to click on an app, or if their phone were enabled to retain such an image, since some of these images disappear after a short while. You should be able to sit on the bus and report it. The tech company would then store and collate those reports, potentially in a manner that would make patterns clear. For instance—because, as we have just heard, this happens so much—if six people on the 27 bus reported multiple times a week that they had received the same image, that would prompt the police to get the CCTV from the bus company to identify the individual, if the tech company data did not provide that specificity. Or is someone hanging out every Friday night at the A&E department, cyberflashing as they sit there? This is not part of the amendment, but such an app or mechanism could also include a reminder to change the security settings on your phone so that you cannot be AirDropped.
I hope that His Majesty’s Government will look at the purpose of this amendment. It is laudable that we are making cyberflashing an offence, but this amendment is about the enforcement of that offence and will support it. Only with such an easy reporting mechanism can what will be a crime be effectively policed.
My Lords, I, too, wish the noble Baroness, Lady Featherstone, a very speedy recovery. Her presence here today is missed, though the amendments were very ably moved by the noble Baroness, Lady Burt. Having worked in government with the noble Baroness, Lady Featherstone, I can imagine how frustrated she is at not being able to speak today on amendments bearing her name.
As my noble friend said, this follows our debate on the wider issues around violence against women and girls in the online world. I do not want to repeat anything that was said there, but I am grateful to him for the discussions that we have had since. I support the Government in their introduction of Amendment 135A and the addition of controlling or coercive behaviour to the priority offences list. I will also speak to the cyberflashing amendments and Amendment 271, introduced by my noble friend Lady Berridge.
I suspect that many of us speaking in this debate today have had briefings from the wonderful organisation Refuge, which has seen a growing number of cases of technology-facilitated domestic abuse in recent years. As a result of this, Refuge pioneered a specialist technology-facilitated domestic abuse team, which uses expertise to support survivors and to identify emerging trends of online domestic abuse.
I draw noble Lords’ attention to a publication released since we debated this last week: the National Police Chiefs’ Council’s violence against women and girls strategic threat risk assessment for 2023, in which a whole page is devoted to tech and online-enabled violence against women and girls. In its conclusions, it says that one of the key threats is tech-enabled VAWG. The fact that we are having to debate these specific offences, but also the whole issue of gendered abuse online, shows how huge an issue this is for women and girls.
I will start with Amendment 271. I entirely agree with my noble friend about the need for specific user reporting and making that as easy as possible. That would support the debate we had last week about the code of practice, which would generally require platforms and search engines to think from the start about how they will enable those who have been abused to report that abuse as easily as possible, so that the online platforms and search engines can then gather that data to build up a picture and share it with the regulator and law enforcement as appropriate. So, while I suspect from what the Minister has said that he will not accept this amendment, the points that my noble friend made are absolutely necessary to this debate.
I move on to the cyberflashing amendment. It has been very ably covered already, so I do not want to say too much. It is clear that women and girls experience harms regardless of the motives of the perpetrator. I also point out that, as we have heard, motivations are very difficult to prove, meaning that prosecutions are often extremely unlikely.
I was very proud to introduce the amendments to what became the Domestic Abuse Act 2021. It was one of my first contributions in this House. I remember that, in the face of a lockdown, most of us were working virtually. But we agreed, and the Government introduced, amendments on intimate image abuse and revenge porn. Even as I proposed those amendments and they were accepted, it was clear that they were not quite right and did not go far enough. As we have heard, on intimate image abuse the Law Commission is now proposing a consent-based offence. Can my noble friend be even clearer—I am sorry that I was not able to attend the briefing—about the distinction between consent-based intimate image abuse offences and intent-based cyberflashing offences, and why the Government decided to make it?
I also gently point out to him that, while I know this is complicated, we are still waiting for the drafting of the intimate image abuse offences, and we are potentially running out of time. Perhaps we will see them at the next stage of the Bill—unless he reveals them like a rabbit out of a hat this afternoon, which I suspect is not the case. These are important offences and it will be important for us to see the detail so that we can scrutinise them properly.
Finally, in welcoming the Government’s amendment on coercive control, I note that coercive control is generally poorly understood by technology companies. Overall, the use of the online world to perpetrate abuse against women and girls, particularly in the domestic abuse context, is certainly becoming better understood, but we are all playing catch-up while the perpetrators run ahead of us. More can be done to recognise the ways that the online world can be used to abuse and intimidate victims, as the Government have recognised with this amendment and as the noble Baroness, Lady Gohir, said. This is very necessary in debating the Bill. I look forward to hearing the Minister’s remarks at the end of this debate.
I will certainly do so. It requires flicking through a number of amendments and cross-referencing them with provisions in the Bill. I will certainly do that in slower time and respond.
We think that the Law Commission, which looked at all these issues, including, I think, the questions put by the noble Lord, has done that well. We were satisfied with it. I thought its briefing with Professor Penney Lewis was useful in exploring those issues. We are confident that the offence as drafted is the appropriate one.
My noble friend Lady Morgan and others asked why both the Law Commission and the Government are taking a different approach in relation to intimate image abuse and to cyberflashing. We are taking action to criminalise both, but the Law Commission recommended different approaches to criminalising that behaviour, to take into account the different actions of the perpetrator in each scenario. Sharing an intimate image of a person without their consent is ipso facto wrongful, as it is a violation of their bodily privacy and sexual autonomy. Sending a genital image is not ipso facto wrongful, as it does not always constitute a sexual intrusion, so additional culpability is required for that offence. To give an example, sending a photograph of a naked protestor, even without the consent of the recipient, is not always harmful. Although the levels of harm and distress resulting from these behaviours may be the same, the criminal law must consider whether the perpetrator’s behaviour was sufficiently culpable for an offence to have been committed. That is why we think the intent approach is best for cyberflashing but have taken a different approach in relation to intimate image abuse.
I thank my noble friend for that explanation, which is very helpful; there is a lot in his reply so far that we will have to bottom out. Is he able to shed any light at all on when we might see the drafting of the intimate image abuse wording, because that would be helpful in resolving some of the issues we have been debating?
I cannot give a precise date. The Committee knows that the dates for this Committee are a moveable feast, but we have been having fruitful discussions on some of these issues—we had one yesterday with my noble friend. I appreciate the point she is making about wanting to see the drafting in good time before Report so that we can have a well-thought-through debate on it. I will certainly reiterate that to the usual channels and to others.
Amendment 271 additionally seeks to require companies in scope to provide systems which enable users to report incidents of cyberflashing to platforms. Clauses 16 and 26 already require companies to set up systems and processes which allow users easily to report illegal content, and this will include cyberflashing; this amendment therefore duplicates an existing requirement in the Bill. Amendment 271 also requires in-scope companies to report cyberflashing content to the Crown Prosecution Service. The Bill does not place requirements on in-scope companies to report the discovery of illegal content online, other than in instances of child exploitation and abuse, reflecting the seriousness of that crime and the less subjective nature of the content being reported in those scenarios.
The Bill, which has been developed in consultation with our partners in law enforcement, aims to prevent and reduce the proliferation of illegal content and activity in the first place and the resulting harm this causes to so many. While the Bill does not place any specific responsibilities on policing, our policing partners are considering how best to respond to the growing threat of online offences, as my noble friend Lady Morgan noted, in relation to the publication last week of the Strategic Threat and Risk Assessment on Violence Against Women and Girls. Policing partners will be working closely with Ofcom to explore the operational impact of the Bill and make sure it is protecting women and girls in the way we all want it to.
I hope that helps noble Lords on the issues set out in these amendments. I am grateful for the support for the government amendment in my name and hope that noble Lords will be content not to move theirs at this juncture.
(1 year, 7 months ago)
Lords Chamber
I shall speak briefly to Amendments 220E and 226. On Amendment 220E, I say simply that nothing should be left to chance on the IWF. No warm words or good intentions can replace the requirement for its work to be seamlessly and formally integrated into the Online Safety Bill regime. I put on record the extraordinary debt that every one of us owes to those who work on the front line of child sexual abuse. I know from my own work how the images linger. We should all do all that we can to support those who spend every day chasing down predators and finding and supporting victims and survivors. I very much hope that, in his response, the Minister will agree to sit down with the IWF, colleagues from Ofcom and the noble Lords who tabled the amendment, and commit to finding a form of words that will give the IWF the reassurance it craves.
More generally, I raise the issue of why the Government did not accept the pre-legislative committee’s recommendation that the Bill provide a framework for how bodies will work together, including when and how they will share powers, take joint action and conduct joint investigations. I have a lot of sympathy with the Digital Regulation Co-operation Forum in its desire to remain an informal body, but that is quite different from the formal power to share sensitive data and undertake joint action or investigation.
If history repeats itself, enforcing the law will take many years and very likely will cost a great deal of money and require expertise that it makes no sense for Ofcom to reproduce. It seems obvious that it should have the power to co-designate efficiently and effectively. I was listening to the Minister when he set out his amendment, and he went through the process that Ofcom has, but it did not seem to quite meet the “efficiently and effectively” model. I should be interested to know why there is not more emphasis on co-regulation in general and the sharing of powers in particular.
In the spirit of the evening, I turn to Amendment 226 and make some comments before the noble Baroness, Lady Merron, has outlined the amendment, so I beg her indulgence on that. I want to support and credit the NSPCC for its work in gathering the entire child rights community behind it. Selfishly, I have my own early warning system, in the form of the 5Rights youth advisory group, made up of the GYG—gifted young generation—from Gravesend. It tells us frequently exactly what it does not like and does like about the online world. More importantly, it reveals very early on in our interactions the features or language associated with emerging harms.
Because of the lateness of the hour, I will not give your Lordships all the quotes, but capturing and reflecting children’s insight and voices is a key part of future-proofing. It allows us to anticipate new harms and, where new features pop up that are having a positive or negative impact, to ask the user groups how they are experiencing those features and that language for themselves. That is standard practice across all consumer groups, so, if this is a children’s Bill, why are children not included in this way?
In the work that I do with companies, they often ask what emerging trends we are seeing. For example, they say that they will accept any additions to the list of search words that can lead to self-harm content, or they ask, “What do we know about the emoji language that is happening now that was not happening last week?” I am always surprised at their surprise when we say that a particular feature is causing anxiety for children. Rather than being hostile, their response is almost always, “I have never thought about it that way before”. That is the value of consulting your consumer—in this case, children.
I acknowledge what the Minister said, and I welcome the statutory consultees—the Children’s Commissioner, the Victims’ Commissioner and so on. It is a very welcome addition, but this role is narrowly focused on the codes of practice at the very start of the regulatory cycle, rather than on the regulatory system as a whole. It does not draw on the wider experience of those organisations that deal with children in real time, such as South West Grid for Learning or the NSPCC, or on the research work done by 5Rights, academics across the university sector or research partners such as Revealing Reality, which provide ongoing, real-time information and understanding of children’s perspectives on their experience.
Likewise, super-complaints and Ofcom’s enforcement powers are what happen after harms take place. I believe that we are all united in thinking that the real objective of the exercise is to prevent harm. That means including children’s voices not only because it is their right but because, so often in my experience, they know exactly what needs to happen, if only we would listen.
My Lords, I speak mainly to support Amendment 220E, to which I have added my name. I am also delighted to support government Amendment 98A, and I entirely agree with the statutory consultees listed there. I will make a brief contribution in support of the noble Lord, Lord Clement-Jones, who introduced Amendment 220E. I thank the chief executive of Ofcom for the discussions that we have had on the designation, and the Minister for the reply he sent me on this issue.
I have a slight feeling that we are dancing on the head of a pin, as we know that we have an absolutely world-leading organisation in the form of the Internet Watch Foundation. It plays an internationally respected role in tackling child sexual abuse. We should be, and I think we are, very proud to have it in the United Kingdom, and the Government want to enhance and further build on the best practice that we have seen. As we have already heard and all know, this Bill has been a very long time in coming, and organisations such as the Internet Watch Foundation, which are pretty certain to be designated because of their expertise and the good work they have done already, should be designated.
However, without knowing that, and without a strong steer of support from the Minister, it becomes harder for them to operate, as they are in a vacuum; things such as funding and partnership working become harder and harder as well. That is what I mean by dancing on the head of a pin—unless the Minister says something about another organisation.
The IWF was founded in 1996, when 18% of the world’s known child sexual abuse material was hosted in the UK. Today that figure is less than 1%, and has been since 2003, thanks to the work of the IWF’s analysts and the partnership approach the IWF takes. We should say thank you to those who work on the front line with the grimmest material imaginable, and who do this to keep our internet safe.
I mentioned, in the previous group, the IWF’s research on girls. It says that it has seen more girls appearing in this type of imagery. Girls now appear in 96% of the imagery it removes from the internet, up almost 30 percentage points from a decade ago. That is another good reason why we want the internet to be a safe place for women and girls. As I say, any delay in establishing the role and responsibility of an expert organisation such as the IWF in working with Ofcom risks leaving a vacuum, and the risk falls on children. That is really the ultimate point: if a vacuum is left and the IWF is not certain about its position, then the children who are harmed most by this awful material are the ones who go unprotected. I do not think that is what anybody wants to see, however much we might argue about whether an order should be passed by Parliament or by Ofcom.