Caroline Dinenage (Conservative - Gosport), in debate with the Home Office (10 months ago)
Commons Chamber
I have a similar concern about mobility scooters. Obviously, they are a fabulous tool, enabling so many in our constituencies to get out and about, but the number of serious injuries caused by mobility scooters has gone up by nearly 60% in the last 10 years, and the number of fatalities has doubled. These heavy class 3 mobility scooters, which can go up to 8 mph and travel on the roads, are not subject to insurance rules and cannot be penalised under dangerous driving regulations. Does my right hon. Friend agree that this is something the Government also need to consider very carefully? I would really love the Minister to look at whether there is any legislation that would be implementable in cases such as these.
My hon. Friend is right, and I hope the Government will respond to that. However, she will forgive me if I focus on the essence of new clause 5, which is e-bikes.
The definition of a legal e-bike is one that uses pedals and also uses electricity to assist the cyclist. All others are illegal. This brings me to the problem: if this measure is going to pass into law, as it will, will the Government press the police to start arresting and prosecuting not only the people who deliberately use e-bikes for nefarious purposes but, more importantly, those who just cycle dangerously on footpaths? E-bikes are more dangerous than ordinary bicycles because, being electrically assisted, they reach higher speeds. Even though the speeds are supposed to be governed, they are still higher than most cyclists will reach in the normal act of pedalling their way to work.
I rise to speak to new clause 47 in my name. This is a very simple new clause, in a way, about how we stop mobile phones that have been stolen from being reconnected to the cloud and sold on. If we can break that link, we can stop the proliferation of mobile phone theft, which has increased by 150%.
Some 200 mobile phones are snatched every single day, and there has been a marked increase in Westminster. I know that a number of MPs have had their mobile phones stolen; some of them are sitting not too far away from me. The amount of money involved in this crime is incredible. I do not believe phone manufacturers are that keen to stop it, because I feel it is part of their business model: when somebody has their mobile phone stolen, they go and buy another one.
New clause 47 says that once somebody’s phone has been stolen and they report it to the police, the police must report it to Apple, Google, Samsung or whoever, which then stops that phone from being reconnected to the cloud. In effect, that phone would become inactive. If the manufacturer failed to do that within 48 hours, it would be fined £10,000. We need to ensure that the manufacturers take this issue seriously, because at present they do not. Here is the simple thing: if we want to stop mobile phones being stolen to order, we need to ensure that IMEI numbers are easily accessible and that thieves cannot reconnect stolen phones.
I rise to speak to new clause 121, which is tabled in my name and supported by my hon. Friend the Member for Rutland and Stamford (Alicia Kearns) and, I am very pleased to say, by Members from both sides of this Chamber. It would extend the definition of extreme pornography to include depictions of non-fatal strangulation, known as NFS.
NFS was made a criminal offence in 2021 under the last Government, not because we think the Government should necessarily stick their nose into what people want to get up to in the bedroom, but because abusers use non-fatal strangulation without consent: it leaves little visible injury, which makes it hard to prosecute in domestic abuse cases. When a woman dies from strangulation, it is becoming increasingly common for the defence to claim that it was a sex game gone wrong.
Non-fatal strangulation has a life out there in the world of online porn. As we know, the UK is a large porn consumer. In any given month, more than 10 million adults in the UK will access online porn, and the vast majority of them will be chaps. That is up to them—we do not judge—but we know from research that online porn is so widespread that one in 10 children have seen it by the age of nine. Unfortunately, it is the guide that many young people use to learn about sex.
That is why I am extremely worried that non-fatal strangulation has been found to be rife on porn sites. Evidence has shown that it is directly influencing the sexual behaviour of young men, who are non-consensually strangling young women during consensual sex. Recent polling has suggested that 17% of 16 to 34-year-olds have been strangled without giving consent during consensual sex.
We are not being prudes in calling for this misogynistic act to be banned in online porn. Health experts warn that there is no way to strangle someone without risk, given that blood and airflow may both be restricted. A person can become unconscious within 10 seconds of being choked, and within 17 seconds they can have a seizure due to lack of oxygen. Death can occur within 150 seconds of being rendered unconscious.
Almost 20% of the women killed in the UK since 2014 were strangled by an intimate partner. Perpetrators who choke their partners are seven times more likely to kill them. I am sure the Minister will agree that it is alarming to hear reports of young men and boys seeking advice on how they can safely strangle their partner in bed and that girls are expected to accept that kind of behaviour. There was even a report last year, which the Minister may have heard about, of draft personal, social, health and economic education guidance from a Welsh local authority including safe choking during sex for a child sex education class. We need to send a signal that strangling your partner in bed is not safe—it can be a precursor to coercive, abusive behaviour. I know that the Government also want to send that signal, because in February they said, in their response to an independent review commissioned by the previous Government:
“The government will take urgent action to ensure pornography platforms, law enforcement and prosecutors are taking all necessary steps to tackle this increasingly prevalent harm.”
I therefore urge the Minister to support my new clause 121, which sets out one of the necessary steps referred to in the Government’s response. We need to back this amendment, ban this harmful practice, and send out a very strong message that depictions of non-fatal strangulation in porn normalise something that is not normal and is not safe.
Joe Powell (Kensington and Bayswater) (Lab)
I rise to speak to new clause 155, which stands in the name of my hon. Friend the Member for Bolton West (Phil Brickell) and is supported by the all-party parliamentary group on anti-corruption and responsible tax. I welcome the Bill for its clear and ambitious strategy to tackle antisocial behaviour and crime, but if we want truly safer streets, we must also step up our efforts to tackle financial and economic crime. That is the aim of our amendment, which is supported by at least 30 Members from across the House.
Caroline Dinenage (Conservative - Gosport), in debate with the Home Office (4 days, 13 hours ago)
Commons Chamber
I am very grateful to the Minister for giving way on that point. I am not sure whether she will come on to this, but the Government have tabled amendments on online safety, and have identified that the next frontline in this war is artificial intelligence. As she knows, we have already seen children taking their own lives after interactions with AI chatbots, and we know that tech companies will always prioritise profits over user safety, so there must be more focus on a safety-by-design approach that prevents AI products that could be harmful to users from coming to market. This approach has been suggested by Baroness Kidron in the other place. Why are the Government not supporting her amendment?
I thank the hon. Lady for her intervention. She is, of course, right about the growing concern around chatbots and the need for safety by design. I will come on to Baroness Kidron’s amendment and the Government’s response to it later on in my speech.
Furthermore, the Government have brought forward Lords amendment 367 to take a power to extend the scope of the Online Safety Act 2023 to cover unregulated AI chatbots. It means that general-purpose AI chatbots, such as Grok, which allow the creation and sharing of non-consensual intimate images, will have to proactively remove that illegal content from their services or face enforcement from Ofcom. Taken together, the measures will deliver an effective ban on nudification tools. Given that, we do not believe that a separate possession offence, as provided for in Lords amendment 505, would make a meaningful difference, not least as many such tools are accessed online, rather than possessed.
Where a person is convicted of an intimate image offence, we agree that it is vital that those images are deleted from the perpetrator’s devices. Amendment (a) in lieu of Lords amendment 258 enables the courts to make an image deletion order following a conviction for an offence related to intimate image abuse. Breach of the order will be a criminal offence. The amendment also enables the courts to require the deletion of other intimate images of the same victim. This approach gives courts the required flexibility to consider the details of each case when applying their powers, while ensuring that the offenders are held accountable for compliance with the order.
I appreciate the challenge that the right hon. Gentleman is raising, and I know that DUP Members of Parliament in particular have raised these concerns before. The challenge here is that Lords amendment 357 would remove the historical safeguard for statements that glorify acts of terrorism committed by proscribed organisations. Our view is that these statements may not necessarily create terrorist risk and may result in the offence capturing legitimate political and social discourse and debate.
I will say two other things to the right hon. Gentleman. First, the independent reviewer of terrorism legislation, Jonathan Hall KC, strongly advised against the removal of the historical safeguard in his review of terrorism legislation following the 7 October attacks in 2023. Secondly, in the light of the concerns that have been raised in the Lords and by Members in this place, the Government will ask the independent reviewer to conduct a more detailed review of the encouragement offence within six months of Royal Assent.
Let me turn to Lords amendment 359. It is a long-standing principle that has been adopted by successive Administrations that the Government do not comment on which organisations are being considered for proscription. Mandating that the Government review whether to proscribe Iranian Government-related organisations would violate this principle and tie the Government’s hands unnecessarily. The Government are already taking decisive action to deter threats from Iran, and we have committed to introducing a new state threats-based proscription tool.
I turn now to Lords amendments 360 and 368 to 372 tabled by Baroness Kidron, which concern chatbots. The Government are clear that we need to act quickly to bring all unregulated AI chatbots within the scope of the Online Safety Act’s requirements on illegal activity. As I mentioned earlier, the Government are seeking to take a regulation-making power to do this, under Lords amendment 367. By taking this power, the Government will be able to remove any ambiguity over whether services like Grok are subject to the Online Safety Act’s provisions to tackle illegal content. This approach also allows us to design regulations that are effective, targeted and informed by necessary consultation with subject matter experts. Amendment (a) in lieu of Lords amendment 372 commits the Government to reporting to Parliament by the end of the year on our progress to develop regulations.
I do not mean to bang on about this, but the fact is that the Government’s approach is too narrow. It focuses on taking down illegal content when it should be the company’s responsibility to prevent harms in the first place, rather than to deal with them after the event. We do not design regulation for any other sector in this way. When designing aircraft, we do not wait until after the plane has crashed to worry about the safety features. The same should apply here.
During Report stage in the Lords, peers voted overwhelmingly in support of the safety-by-design approach. They also understood that when it comes to the design of something, harm includes building in aspects that are addictive and manipulative, which have been key to some of the very tragic suicides of children who have interacted with AI chatbots. What do the Government have against building safety by design into the very purpose of AI chatbots?
The hon. Lady makes her case very clearly, and we can agree that we need to design out those kinds of issues. The challenge lies in what we do and how we do it, and that was the difficulty with this particular group of amendments. There is, of course, wider work being done on violence against women and girls and on how the Online Safety Act is to be rolled forward, and that work is really important, but we are talking about this particular group of Lords amendments on chatbots and the challenges they raise. That is why, through amendment (a) in lieu, we commit to reporting by the end of the year on our progress in developing regulations.
We are clear that regulation is a more effective and proportionate tool than the criminal law for addressing risks from AI chatbots and setting industry best practice. Incorporating currently unregulated chatbots into the scope of the Online Safety Act will ensure that such regulation applies extraterritorially, which is crucial when dealing with international companies.
The Government’s approach is also broader in scope than the content of amendments 360 and 368 to 372. Those amendments would not capture image generators creating non-consensual graphic images of women or online AI chatbot toys such as Gabbo. The Government’s amendment in lieu does capture such services and allows them to be clearly brought under online safety regulations.