Baroness Morgan of Cotes
Main Page: Baroness Morgan of Cotes (Non-affiliated - Life peer)Department Debates - View all Baroness Morgan of Cotes's debates with the Home Office
(1 month, 2 weeks ago)
Lords Chamber
My Lords, I put my name to Amendments 479 and 480, and I support the other amendments in this group. I once again have to thank my noble friend Lady Kidron for raising an issue which I had missed and which, I fear, the regulator might have missed as well. After extensive research, I too am very worried that the Online Safety Act, which many of your Lordships spent many hours refining, does not cover some of the new developments in the digital world, especially personalised AI chatbots. They are hugely popular with children under 18; 31% use Snapchat’s My AI and 32% use Google’s Gemini.
The Online Safety Act Network set up an account on ChatGPT-5 using a 13 year-old persona. Within two minutes, the chatbot was engaged with the user about mental health, eating disorders and advice about how to safely cut yourself. Within 40 minutes, it had generated a list of pills for overdosing. The OSA was intended to stop such online behaviour. Your Lordships worked so hard to ensure that the OSA covered search and user-to-user functions in the digital space, but AI chatbots have varied functionalities that, as my noble friend pointed out, are not clearly covered by the legislation.
My noble friend Lady Kidron pointed out that, although Dame Melanie Dawes confirmed to the Communications and Digital Committee that chatbots are covered by the OSA, Ofcom in its paper Era of Answer Engines admits:
“Under the OSA, a search service means a service that is, or which includes, a search engine, and this applies to some (though not all) GenAI search tools”.
There is doubt about whether the AI interpretive process, which can change the original search findings, excludes it from being in the scope of search under the OSA. More significantly, AI chatbots are not covered where the provider creates content that is personalised for one user and cannot be forwarded to another user. I am advised that this is not a user-to-user service as defined under the Act.
One chatbot that seems to fall into this category is Replika. I had never heard of it until I started my research for this amendment. However, 2% of all children aged nine to 17 say that they have used the chatbot, and 18% have heard of it. Its aim is to simulate human interaction by creating a replica chatbot personal to each user. It is very sophisticated in its output, using avatars to create images of a human interlocutor on screen and a speaking voice to reply conversationally to requests. The concern is that, unlike traditional search engines, it is programmed for sycophancy, or, in other words, to affirm the user’s responses and keep the user engaged—the more positive the response, the more engaged the child user. This has led to conversations in which the AI companion talks the child user into self-harm and even suicidal ideation.
Research by Internet Matters found that a third of child users think that interacting with chatbots is like talking to a friend. Most concerning is the level of trust they generate in children, with two in five saying that they have no concerns about the advice they are getting. However, because the replies are supposed to be positive, what might have started as trustworthy advice develops into unsafe advice as the conversation continues. My concern is that chatbots are not only reinforcing the echo chambers that we have seen developing for over a decade as a result of social media polarisation but are reducing yet further children’s critical faculties. We cannot leave the development of critical faculties to the already inadequate media literacy campaigns that Ofcom is developing. The Government need to discourage sycophancy and a lack of critical thinking at its digital source.
A driving force behind the Online Safety Act was the realisation that tech developers were prioritising user engagement over user safety. Once again, we find new AI products that are based on the same harmful principles. In looking at the Government’s headlong rush to surrender to tech companies in the name of AI growth, I ask your Lordships to read the strategic vision for AI laid out in the AI Opportunities Action Plan. It focuses on accelerating innovation but fails to mention once any concern about children’s safety. Your Lordships have fought hard to make children’s safety a priority online in legislation. Once again, I ask for these amendments to be scrutinised by Ofcom and the Government to ensure that children’s safety is at the very centre of their thinking as AI develops.
My Lords, I support the amendments of the noble Baroness, Lady Kidron. I was pleased to add my name to Amendments 266, 479 and 480. I also support the amendment proposed by the noble Lord, Lord Nash.
I do not want to repeat the points that were made—the noble Baroness ably set out the reasons why her amendments are very much needed—so I will make a couple of general points. As she demonstrated, what happens online has what I would call real-world consequences—although I was reminded this week by somebody much younger than me that of course, for the younger generation, there is no distinction between online and offline; it is all one world. For those of us who are older, it is worth remembering that, as the noble Baroness set out, what happens online has real-world, and sadly often fatal, consequences. We should not lose sight of that.
We have already heard many references to the Online Safety Act, which is inevitable. We all knew, even as we were debating the Bill before it was enacted, that there would have to be an Online Safety Act II, and no doubt other versions as well. As we have heard, technology is changing at an enormously fast rate, turbocharged by artificial intelligence. The Government recognise that in Clause 63. But surely the lesson from the past decade or more is that, although technology can be used for good, it can also be used to create and disseminate deeply harmful content. That is why the arguments around safety by design are absolutely critical, yet they have been lacking in some of the regulation and enforcement that we have seen. I very much hope that the Minister will be able to give the clarification that the noble Baroness asked for on the status of LLMs and chatbots under the Online Safety Act, although he may not be able to do so today.
I will make some general points. First, I do not think the Minister was involved in the debate on and scrutiny of—particularly in this Chamber—what became the Online Safety Act. As I have said before, it was a master class in what cross-party, cross-House working can achieve, in an area where, basically, we all want to get to the same point: the safety of children and vulnerable people. I hope that the Ministers and officials listening to and involved in this will work with this House, and with Members such as the noble Baroness who have huge experience, to improve the Bill, and no doubt lay down changes in the next piece of legislation and the one after that. We will always be chasing after developments in technology unless we are able to get that safety-by-design and preventive approach.
During the passage of the then Online Safety Bill, a number of Members of both Houses, working with experienced and knowledgeable outside bodies, spotted the harms and loopholes of the future. No one has all the answers, which is why it is worth working together to try to deal with the problems caused by new and developing technology. I urge the Government not to play belated catch-up as we did with internet regulation, platform regulation, search-engine regulation and more generally with the Online Safety Act. If we can work together to spot the dangers, whether from chatbots, LLMs, CSAM-generated content or deepfakes, we will do an enormous service to young people, both in this country and globally.
My Lords, I support Amendments 479 and 480, which seek to prevent chatbots producing illegal content. I also support the other amendments in this group. AI chatbots are already producing harmful, manipulative and often racist content. They have no age protections and no warnings or information about the sources being used to generate the replies. Nor is there a requirement to ensure that AI does not produce illegal content. We know that chatbots draw their information from a wide range of sources that are often unreliable and open to manipulation, including blogs, open-edit sites such as Wikipedia, and messaging boards, and as a result they often produce significant misinformation and disinformation.
I will focus on one particular area. As we have heard in the contributions so far, we know that some platforms generate racist content. Looking specifically at antisemitism, we can see Holocaust denial, praise of Hitler and deeply damaging inaccuracies about Jewish history. We see Grok, the X platform, generating numerous antisemitic comments, denying the scale of the Holocaust, praising Adolf Hitler and, as recently as a couple of months ago, using Jewish-sounding surnames in the context of hate speech.
Impressionable children and young people, who may not know how to check the validity of the information they are presented with, can so easily be manipulated when exposed to such content. This is particularly concerning when we know that children as young as three are using some of these technologies. We have already heard about how chatbots in particular are designed in this emotionally manipulative way, in order to boost engagement. As we have heard—it is important to reiterate it—they are sycophantic, affirming and built to actively flatter.
If you want your AI chatbot or platform not to flatter you, you have to specifically go to the personalisation page, as I have done, and be very clear that you want responses that focus on substance over praise, and that it should skip compliments. Otherwise, these platforms are designed to act completely the other way. If a person acted like this in some circumstances, we would call it emotional abuse. These design choices mean that young people—teens and children—can become overly trusting and, as we have heard in the cases outlined, reliant on these bots. In the most devastating cases, we know that this focus on flattery has led to people such as Sophie Rottenberg and 16 year-old Adam Raine in America taking their own lives on the advice of these AI platforms. Assisting suicide is illegal, and we need to ensure that this illegality extends to chatbots.
(1 week, 2 days ago)
Lords Chamber
My Lords, I had hoped, as the Minister knows, that we might have reached this amendment last month, in the same week the Government published their long-awaited violence against women and girls strategy—which would have been appropriate—but I left him to debate another extremely important issue. It is a pleasure to open the proceedings on the Bill in 2026 with this amendment.
I am sure all noble Lords support the Government’s ambition of halving violence against women and girls. The challenge with any such strategy is of course in its delivery. Securing safer public spaces for women and girls is essential, and safer streets was of course a key demand, and continues to be, following the terrible murder of Sarah Everard. But there are of course many places where women and girls feel unsafe, and that includes trains and public transport.
I noted this paragraph, on page 65 of the Government’s December strategy, which is headed “Every corner of public life will be safe”:
“Women and girls must both feel safe and be safe in every aspect of public life. … Safety is not just about reducing risk, it is about creating environments that foster confidence, dignity, and freedom of movement. Design and planning are critical tools in achieving this. Well-lit streets, accessible transport, and thoughtful urban design can deter violence, reduce opportunities for harm, and send a clear message that public spaces belong to everyone. By embedding considerations of VAWG into planning and transport guidance, we can ensure that safety is built into the fabric of our communities, making public spaces welcoming and secure for all. To support this, we”,
the Government,
“will update national design guidance to reflect a VAWG perspective, ensuring that safety considerations inform how public spaces are designed”.
Turning to this amendment, I think that the Committee should be aware that, since 2021, there has been an alarming rise in violence against women and girls on our railways—it is up 59%. Sexual offences specifically have risen by 10% and harassment is up 6%. To put this in actual numbers, in 2022-23, there were 2,475 sexual offences; that was up from 2,246 the year before. In 2021, 7,561 crimes against women and girls on railways were recorded by British Transport Police; that had risen by 2023-24 to 11,357.
It is therefore no surprise that these crimes are now classed as a national emergency by the National Police Chiefs’ Council. Due to this, nearly two thirds—63%—of women say they avoid travelling alone, and even women who continue to use public transport often undertake what is called “normalised behaviour”, like being very choosy as to where they sit or assiduously avoiding making eye contact with any fellow passengers.
Of course, the numbers tell only half the story. For each survivor of an offence or an attempted offence, their experience stays with them, as we heard just in the last couple of weeks in the powerful testimony given by Her Majesty the Queen. But there are of course many others who have bravely shared their experiences of vulnerability in a place where they should not feel vulnerable at all. It is clear from the numbers I have just given to the Committee that action is needed to ensure that measures can be put in place to reduce this level of crime against women and girls on our national rail network, and the Government need to take a lead on this.
This is, of course, a probing amendment. The wording in subsection (1) would place a clear duty:
“The British Transport Police must take all reasonable steps to prevent violence against women and girls on trains”.
Subsection (2) sets out what such abuse could entail but is not limited to those offences. Subsection (3) sets out what “reasonable steps” must include. The reason that this is a probing amendment is that I suspect that the Minister will tell me shortly that this is not the right Bill for such an amendment, so I want to take this opportunity to say that, while I might have some limited sympathy for his argument—
He is looking slightly surprised, so perhaps I have pre-empted his argument or that is not the argument that he is going to make, in which case I will be delighted. But if it is, Ministers will not be able to use the same argument in the forthcoming Railways Bill, where the Government will be accepting a clear responsibility for what happens on trains operating as part of their newly nationalised services.
The reason for subsection (3) is that enforcement after the event for perpetrators is not sufficient if the Government are to stand any chance of cutting violence against women and girls by 50%. Prevention is key to achieving anything like that goal. The suggestion in subsection (3) about what could constitute “reasonable steps” is vital if we are to move to a preventative and safety-by-design model. A crucial first step would be, as the amendment suggests, the sharing of data about cases and levels of violence against women and girls between the British Transport Police and the rolling stock companies. Of course, this is not just about violence against women and girls in relation to passengers but is highly relevant to female staff operating on the rail network.
Following Royal Assent to the Passenger Railway Services (Public Ownership) Act 2024, the Department for Transport instructed DfT Operator to assume ownership of train operators in England and provide
“safe, secure and sustainable transport”.
However, since then, there has been no clarification as to how this will occur. These amendments provide a way in which there can be a review of safety issues and standards on trains.
Better and more synchronised technology, subject to government standards and fitted at the point that a train is manufactured, would truly create that safe, secure and sustainable transport. It would also ensure that the Government had true oversight of this issue and that all the modern technology and innovation used by rail operating companies to help build passenger confidence, especially among women, came from manufacturers subject to a gold standard. In addition, as long as manufacturers have the option not to include extra specifications that cost them money and bring them no monetary benefit, merely societal benefit, they are less likely to install such measures, especially in the current economic environment, where every penny will count.
It is worth remembering that the previous Government created the secure stations scheme, which emphasised collaborative working and station design that deters crime and aids the safeguarding of vulnerable individuals. Using advanced technologies created by innovative companies that provide rolling stock with custom-made parts and technology, these amendments would allow the extension of this scheme to further improve passenger confidence. Just including more CCTV is insufficient. Design features such as improved lighting deployed by companies such as Belvoir Rail are very relevant here.
This amendment is an early opportunity for the Government to show that they are ready to stand behind their December violence against women and girls strategy. It would also demonstrate that delivery of the strategy is a priority across all government departments and is not just being left to the Home Office. I beg to move Amendment 356A.
My Lords, it is a pleasure to follow the noble Baroness, Lady Morgan of Cotes. My proposed Amendment 356F is complementary, in a sense, to hers. My amendment would create a specific offence of assaulting a transport worker at work. It would be an equivalent protection to that given to retail workers by Clause 37 of the Bill, and there is of course existing legislation protecting emergency workers. I confess to a certain unease in proposing specific offences for specific groups of workers, but in the case of transport workers there are particular circumstances which justify an offence to protect them.
There has been a marked increase in violence against transport workers. The situation was, of course, highlighted by the multiple stabbings at Huntingdon on 1 November 2025. But violent offences against rail staff increased by 35% in 2024, according to the British Transport Police Authority. Both the number of incidents and the severity of violence against transport workers are increasing. Of course, it is not just railway workers; the transport workers protected would include those on the Underground, the Metro, trams, ferries and buses, and all other transport workers.
My noble friend mentioned his noble kinsman, my noble friend Lord Hendy of Richmond Hill, who is the Minister responsible for transport; the British Transport Police are the responsibility of, and answer to, the Department for Transport. If I may, I will refer that request to the Minister directly responsible for that policy, so that he can consider what my noble friend has just said.
There is a distinction between the existing legislation that I have mentioned, which provides security against attack for public-facing workers, and the Clause 37 issue, which we have already debated. We may undoubtedly return to this on Report in several forms but, in the meantime, I would be grateful if the noble Baroness would withdraw her amendment.
I thank all noble Lords who have taken part in this short debate. It is one of those debates that shows the Chamber at its finest, with a genuine discussion of some important issues. This was a deliberately narrow amendment, but I welcome the comments made across the Committee on how it could be widened. I particularly welcome the comment of my noble friend Lady McIntosh about public spaces more broadly, but also the suggestion relating to other forms of public transport, especially trams. I expect that we could apply this to the Underground, not just in London but in other cities too.
I welcome the comments from the noble Lord, Lord Blencathra, about behaviour on trains. The list of offences in proposed new subsection (2) is not exhaustive, and I fully take his point. There is an irony to debating this amendment at a rather more civilised time of the day than we might otherwise have done, had we reached it in December. One reason why I wanted to know whether we were going to reach the debate was that, because we sat late previously, I had to get a 10.30 pm train home to Leicestershire. I would describe myself as being rather robust, but I do not want to travel at half past 10 at night and get home to a deserted car park at nearly midnight. I do not think that anybody wants to do that, nor should we ask members of the House staff to do so. However, I will leave that debate about sitting hours for a very different set of noble Lords to consider.
I thank the Minister for his very helpful and constructive comments on my amendment. The Committee has identified that this is an issue about prevention of violence against women and girls, not just enforcement after the event. He rightly took the point that it is not just about British Transport Police but about working with the train operating companies, as he mentioned. I would very much like to take up his offer of a meeting, whether with Department for Transport officials or with the Rail Safety and Standards Board; he mentioned its forthcoming consultation. I think that we will return to this issue in the Railways Bill, so he can let the other noble Lord, Lord Hendy, know to expect such a debate. For now, I beg leave to withdraw the amendment.
My Lords, I added my name to the amendment moved by the noble Lord, Lord Vaux. Like him, I served on the Select Committee on fraud, ably chaired by my noble friend Lady Morgan of Cotes, that produced a very substantial document indeed. After we produced our report, the Government published a consultation document headed Preventing the Use of SIM Farms for Fraud. In December 2023, the Government published their response to that document. I want to quote briefly from three paragraphs of that response.
Referring to the responses they got, the Government said:
“A few responses noted that banning physical SIM farms alone is likely to result in displacement to eSIM farms”,
which is the point that has just been made. They went on:
“However they acknowledged that if eSIMs were included to the proposed ban, the Government’s definition of SIM farms should be adapted to ensure it excludes smartphones that can hold more than four eSIMs”.
The Government’s response to that section was:
“Responses noted that the definition could also include eSIMs and mobile apps. However, we did not receive sufficient evidence at consultation to include them in a proposed ban, due to their complexity and ongoing pace of development. This could be further addressed by the proposed powers to extend the ban to other forms of telecommunications equipment and articles used to perpetrate fraud”.
They referred to a further final paragraph headed “Government response”:
“The Government considers it important to ensure that the ban is flexible and can be used to rapidly prohibit other types of technology where these are identified in the future. Some such technologies are mentioned above, whilst others may emerge in future and the Government will continue to review fraud methodologies closely for changing patterns and new technologies being used, such as eSIM farms and others. However, the Government agrees with respondents that any powers to ban through secondary legislation ought to have clear parameters for their use”.
That was the last Administration, of course, and it would be helpful to know whether the Government agree with that line.
The question I want to ask the Minister is this. Referring to the clauses on SIM farms, Clause 114(4) says:
“The Secretary of State may by regulations amend this section (other than this subsection)”.
Is that in effect giving the Secretary of State powers to introduce by secondary legislation something that the previous Government said should not be done by secondary legislation? I leave that question hanging in the air while the Government seek advice from the Bench to see what the answer is.
My Lords, I will briefly speak to Amendment 358. It is a pleasure to follow the noble Lords, Lord Vaux and Lord Young of Cookham. Because we are going to be discussing this and a later amendment on fraud, I declare my interest as a director of Santander UK.
It was a huge pleasure and privilege to chair the Lords inquiry into online and digital fraud, which reported in 2022, and I would like to think that we had some impact in raising the issues, which are of huge importance to the public. Fraud is one of the crimes that people are most likely to be victims of. I know the Minister knows that because he is the Anti-Fraud Minister in the department.
Noble Lords have already spoken about the importance of this amendment, the need for the law to be kept up to date as the technology develops, and the fact that allowing as much flexibility in legislation as possible to enable that to happen is important. The reason we talked about the “fraud chain” in the report is that, obviously, people encounter fraud in myriad ways. Fraudsters are, as we have heard, incredibly flexible, and entrepreneurial—for all the wrong reasons. Of course, telecoms—people’s smartphones or phones—is where many people will first encounter the fraudster, who will then try, as we heard in our evidence, to get them away from technology and strike up some kind of relationship which unfortunately ends in people often losing life-changing amounts of money.
I do not want to pre-empt the debate on Amendment 367, which I hope we will also reach today, but the question, perhaps now or for later, is whether the Minister is confident that the previous Government’s and this current Government’s ask of the telecoms industry is strong enough given the frequency with which the public encounter fraud via their telephones. I will ask the question now, but I am sure we will come back to it. We are all waiting for the forthcoming fraud strategy from the Government, which we understand is—I hope—close. Can the Minister give us a little precursor of whether that will impose tougher asks and potential penalties on the telecoms companies for the reasons that we have already heard?
My Lords, we strongly support Amendment 358 in the names of the noble Lords, Lord Vaux, Lord Young of Cookham and Lord Holmes of Richmond, and the noble Baroness, Lady Morgan of Cotes, who have made the case extremely well today. I pay tribute to the Fraud Act committee chaired by the noble Baroness, Lady Morgan, and I shall quote from it extensively in the next group.
This amendment would rightly ensure that the definition of a specified article included devices capable of using virtual subscriber identity modules, not just physical SIM cards. As we have heard, the criminal landscape evolves rapidly. If we legislate only for plastic SIMs, criminals will simply pivot to readily available virtual SIM technology. By incorporating virtual SIMs into the definition now, we will help to future-proof these provisions and make them genuinely effective against highly scalable, technology-enabled fraud.
Clauses 112 to 117 quite rightly seek to address the serious and growing problem of SIM farms being used at scale to perpetrate fraud and other abuses—it was very interesting to hear the noble Lord, Lord Young, quote from the Government’s consultation response, which demonstrates that the problem has been with us for several years now—but, as drafted, Clause 114 risks being a technological step behind the criminals. As we have heard, it refers to devices capable of using physical SIM cards, but the market is already rapidly moving towards virtual or embedded SIMs. Indeed, I have an iPad in my hand that has a virtual SIM inside it—no physical SIM card at all. If the Bill focuses only on the plastic card and not the underlying functionality, it will leave an obvious loophole that organised criminals will quickly exploit.
The noble Baroness, Lady Morgan, spoke of “entrepreneurial” but not in a good way. We know that fraudsters are highly adaptive. As mobile operators deploy more robust controls on physical SIMs—I suspect not enough for the noble Lord, Lord Vaux—and as handsets and routers increasingly use eSIMs or other virtual identities, those intent on running industrial-scale smishing and scam operations will migrate to those platforms. If we legislate today for yesterday’s technology, we will simply displace the problem from one category of device to another and be back here in a few years’ time having the same debate. I hope the Minister will be able either to accept the amendment or to confirm that the Government will bring forward their own wording—there is always a bit of “not invented here” with these things. Without that assurance, there is a real risk that this part of the Bill will be lacking in force from the day it comes into effect.