Lords Chamber
My Lords, I will speak to Amendment 14; I am very grateful to the noble Lord, Lord Polak, and the noble Baroness, Lady Kidron, for adding their names to it. It seeks to clarify the definition of a “crime of violence” in the criminal injuries compensation scheme when it refers to the abuse of a child that has happened online. I hope there will be a response to my arguments not dissimilar to the Minister’s response to Amendment 2—namely, that there appears to be a gap that is worthy of being looked into properly and systematically.
Survivors of technology-facilitated child sexual abuse—I am afraid that it has an acronym, TCSA—and other coercive online sexual offences may be refused compensation under the criminal injuries compensation scheme on the basis that the injury did not result from a “crime of violence”, despite the seriousness of the abuse and the criminal offences involved. Amendment 14 seeks to clarify that cases that involve “coercion”, “domination” or “compelled” sexual acts fall within the scope of the scheme.
I am afraid that it will probably not surprise your Lordships that the scale of online child sex abuse is going up dramatically. Over 7,000 offences of sexual communication with a child were recorded in 2023-24, and 122,768 child sexual abuse and exploitation offences were recorded in 2024, of which almost half—42%—had an online element. The criminal injuries compensation scheme obviously cannot accept all the applications made to it for support. Year on year, it has refused a growing proportion of the applications made to it. Nevertheless, the number of applications for support involving sexual abuse and sexual assault is going up even more quickly. Some 1,601 applicants who reported sexual assault were refused compensation in 2024-25. The number of refusals under this threshold has increased by just over one-quarter in two years. However, the scheme does not record detailed offence categories, so we do not know exactly how many of those related to child sexual abuse.
To illustrate, I will briefly give an example of exactly what this involves. We are working with a Northern Ireland-based charity called the Marie Collins Foundation, which is particularly focused on trying to help victims of these offences. The foundation recently supported a child who was subjected to sustained online sexual coercion by an adult offender who used manipulation and threats to compel the child to perform sexual acts via digital communication. Over time, the offender established control through grooming, emotional manipulation and threats to expose the child if they did not comply with further sexual demands. The abuse caused significant psychological harm, including anxiety, shame and trauma, consistent with other forms of child sexual abuse.
When the victim applied to the criminal injuries compensation scheme, the claim was initially refused on the basis that the injury did not arise from a crime of violence. But the decision was subsequently overturned on appeal, recognising the seriousness of the abuse and the harm it caused. The case illustrates the uncertainty in how coercive online sexual abuse of children is interpreted within the scheme and the additional burden it places on victims, who have to pursue appeals to the scheme to try to get their case heard.
Amendment 14 seeks to provide clarity rather than an expansion of the scheme. It would simply ensure that cases involving coercion, domination or compelled sexual acts, including those facilitated online, are recognised as crimes of violence for the purposes of compensation. This would help the survivors of serious sexual abuse and ensure that they are not excluded due to uncertainty over the interpretation of the scheme.
I hope that we do not have any children in the Public Gallery at the moment. I will just briefly describe what some of this involves online. I have already mentioned blackmail, coercion, threats, domination, and emotional and psychological abuse. There is the creation and sharing of sexual images, livestreamed sexual activity and other sexual acts, fear, loss of autonomy, erosion of agency and long-term psychological harm. The children are sometimes asked or invited to insert various objects into parts of their body. Some of the things that happen are simply unspeakable. The purpose of the amendment is to draw this to the attention of the Government and to ask that this be looked at carefully and seriously, not least because, as we know, in so many cases happening in the online world, the volume and types of abuse are increasing exponentially.
My Lords, I added my name to Amendment 14, alongside that of my noble friend Lord Russell, and he has adequately explained the gap.
I started, unfortunately, looking at child sexual abuse in 2012. Unfortunately, in the period since then, I have had the misfortune to look at a great deal of child sexual abuse and I say that it is an act of violence against the person in the image.
While the noble Lord, Lord Russell, was speaking, I remembered one of the very first experiences I had. I filmed an interview with a young girl at the moment she realised that the person online, who she thought was her lover, was indeed a groomer. In the next moment, she realised that she had been recorded, and in the next moment, she realised that the recording had been shared. In those moments, I watched a heartbreak, faith-break and trust-break. That young child tried to commit suicide twice in the following summer. We were able to get her help and, thankfully, she is now a survivor and not a victim. I am standing up only to say that what happens online does not stay online. What happens online is violence. What happens to children online must not be ignored by the law.
Lords Chamber
My Lords, briefly, I support the amendments in the name of the noble Baroness, Lady Bertin. It has been a very grim afternoon, I have to say, repeatedly hearing some of the most horrendous things that can happen to women and children. I say to the Minister, for whom I have a great deal of respect and who spoke passionately—a word normally associated with me—that this is still too little, too late and too long across a number of these issues. I know that the noble Baroness, Lady Levitt, is relatively new in the House, but we have been debating these things for eight years and I remember having this exact discussion during the Online Safety Bill. We have to just move on. We cannot keep on saying that it moves quickly and then allowing ourselves to move this slowly.
The noble Baroness, Lady Bertin, made a really strong case that online porn affects real life. It is real-life violence and there is this unbelievably vast overlap with child sexual abuse. It is that mess that we have to see as one and, in that sense, the noble Baroness made the case for all of her amendments. I want to quickly mention government Amendment 272, which establishes an offence if a person makes or adapts, or
“supplies or offers to supply a thing, for use as a generator of … intimate images”.
What has happened to that amendment is exactly the same as what happened to the child sexual abuse amendment that has the same form. It deals with intentionality and says: “If you absolutely intend to do this, it will be illegal. But if it happens in general, on any old piece of software that somebody hasn’t bothered to train properly or put protections in, then you’re not caught”. I believe that is what the noble Baroness has in her broader amendment about software.
I really want to make the point that there seems to be a reluctance to catch general-purpose technology in these issues of child abuse, violence against women, intimate image abuse and pornography, and I hope that the Government are listening. We cannot avoid general-purpose technology if that is what is spreading, creating and making this situation available across communities. It is in that space that so many children first see porn. It is in that space that so many women are abused and that so much child sexual abuse is present.
I urge the Minister to think about the breadth and not just the intentionality, because in my view it does not really matter whether it is accidental on the part of the company. I finish by saying that I had the privilege of meeting Yoshua Bengio last week, who is absolutely central to the development of AI and neural networks, and so on. He said, and I paraphrase: show me the incentive and I will show you the design.
My Lords, I rise very briefly, partly as a male of the species, since we are largely responsible for the situation we are describing. We are behind these business models, we are the sex that is making all the money out of it, and, in most cases, we are the abusers. It behoves us to acknowledge that and speak up about it.
I pay tribute to the noble Baroness, Lady Bertin. As a mother of young children, she has, on our behalf, subsumed herself for over two years in a world that most of us can barely imagine. That must have been an extraordinarily unpleasant and difficult experience. I pay tribute to her for doing it, because I am not sure many of us would have taken that on or lasted the course.
With that in mind, given the time and thought that she has given to this, the number of experts she has spoken to, the number of international parameters and comparators she has taken into account in looking at this, and the detailed way in which she has analysed the business models that underpin this highly profitable business, it behoves all of us, and particularly the Government, to listen very carefully. The amendments that she has brought forth are not something that she dreamed up overnight; they are based on her detailed and painful knowledge of exactly how this business operates. She is identifying some gaps in the laudable approach the Government are taking to try to do something about this.
With my business experience hat on, I say that a major fault that businesses make is overpromising and underdelivering. His Majesty’s Government are in grave danger of doing exactly that in many of these areas to do with violence against women and girls. It is wonderful to have the headlines and to say, “We are taking this seriously and we are doing something about it”, but the devil is in the detail, and the detail is effective implementation. To effectively implement, you have to understand the business model, and, as people have said previously, you have to be prepared to disrupt it.
Lords Chamber
My Lords, I will speak to my Amendments 281 to 281B. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to them. I will deal first with Amendments 281 and 281B, then move to 281A.
On Amendments 281 and 281B, the Minister will recall that in Committee we had a discussion around how functionality is defined in the Bill and that a great deal of the child risk assessments and safety duties must have regard to functionality, as defined in Clause 208. However, as it is currently written, this clause appears to separate out functionalities of user-to-user services and search services. These two amendments are designed to adjust that slightly, to future-proof the Bill.
Why is this necessary? First, it reflects that it is likely that in the future, many of the functionalities that we currently see on user-to-user services will become present on search services, and possibly vice versa. Therefore, we need to try to take account of how the world is likely to move. Secondly, this is already happening, and it poses a risk to children. Research by the 5Rights Foundation has found that “predictive search”, counted in the Bill as a search service functionality, is present on social media websites, leading one child using a search bar to be presented in nanoseconds with prompts associated with eating disorders. In Committee, the Minister noted that the functionalities listed in this clause are non-exhaustive. At the very least, it would be helpful to clarify this in the Bill language.
Amendment 281A would add specific functionalities which we know are addictive or harmful to children and put them in the Bill. We have a great deal of research and evidence which demonstrates how persuasive certain design strategies are with children. These are features which are solely designed to keep users on the platform, at any cost, as much as possible and for as long as possible. The more that children are on the platform, the more harm they are likely to suffer. Given that the purpose of this Bill is for services to be safe by design, as set out usefully in Amendment 1, please can we make sure that where we know—and we do know—that risk exists, we are doing our utmost to tackle it?
The features that are listed in this amendment are known as “dark patterns”—and they are known as “dark patterns” for a very good reason. They have persuasive and pervasive design features which are deliberately baked into the design of the digital services and products, to capture and hold, in this case, children’s attention, and to create habitual, even compulsive behaviours. The damage this does to children is proven and palpable. For example, one of the features mentioned is infinite scroll, which is now ubiquitous on most major social media platforms. The inventor of infinite scroll, a certain Aza Raskin, who probably thought it was a brilliant idea at the time, has said publicly that he now deeply regrets ever introducing it, because of the effect it is having on children.
One of the young people who spoke to the researchers at 5Rights said of the struggle they have daily with the infinite scroll feature:
“Scrolling forever gives me a sick feeling in my stomach. I’m so aware of how little control I have and the feeling of needing to be online is overwhelming and consuming”.
Features designed to keep users—adults, maybe fine, but children not fine—online at any cost are taking a real toll. Managing public and frequent interactions online, which the features encourage, creates the most enormous pressures for young people, and with that comes anxiety, low self-esteem and mental health challenges. This is only increasing, and unless we are very specific about these, they are going to continue.
We have the evidence. We know what poses harm and risk to children. Please can we make sure that this is reflected accurately in the Bill?
My Lords, I rise briefly to support many of the amendments in this group. I will start with Amendments 281, 281A and 281B in the name of my noble friend Lord Russell, to which I have added my name. The noble Lord set out the case very well. I will not reiterate what he said, but it is simply the case that the features and functionalities of regulated companies should not be separated by search and user-to-user but should apply across any regulated company that has that feature. There is no need to worry about a company that does not have one of the features on the list, but it is a much more dangerous thing to have an absent feature than it is to have a single list and hold companies responsible for their features.
Only this morning, Meta released Threads as its challenger to Twitter. In the last month, Snapchat added generative AI to its offering. Instagram now does video, and TikTok does shopping. All these companies are moving into a place where they would like to be the one that does everything. That is their commercial endgame, and that is where the Bill should set its sights.
Separating out functionality and, as the noble Lord, Lord Russell, said, failing to add what we already know, puts the Bill in danger of looking very old before the ink is dry. I believe it unnecessarily curtails Ofcom in being able to approach the companies for what they are doing, rather than for what the Bill thought they might be doing at this point. So, if the Minister is not in a position to agree to the amendment, I urge him at least to take it away and have a look at it, because it is a technical rather than an ideological matter. It would be wonderful to fix it.