Online Safety Bill Debate
Baroness Finlay of Llandaff (Crossbench, Life peer), in debate with the Department for Digital, Culture, Media & Sport
(1 year, 6 months ago)
Lords Chamber

My Lords, I am very grateful to the noble Baroness, Lady Harding, for the way she introduced this group of amendments. I have added my name to Amendment 125 and have tabled probing Amendments 241 and 301 in an attempt to future-proof the Bill. As the noble Baroness has said, this is not the future but today, tomorrow and forever, going forwards.
I hope that there are no children in the Public Gallery, but from my position I cannot see.
There are some children in the Public Gallery.
Then I shall slightly modify some of the things I was going to say.
When this Bill was conceived, the online world was very different from how it is today. It is hard to imagine how it will look in the future. I am very grateful to the noble Baroness, Lady Berridge, and the Dawes Centre for Future Crime at UCL, for information that they have given to me. I am also grateful to my noble friend Lady Kidron, and the enforcement officers who have shared with us images so horrific that I wish I had never seen them—but you cannot unsee what you have seen. I admire how they have kept going and maintained a moral compass in their work.
The metaverse is already disrupting the online world as we know it. By 2024, it is estimated that there will be 1.7 billion mobile augmented-reality user devices worldwide. More than one-fifth of five to 10 year-olds already have a virtual reality headset of their own, or have asked for similar technology as a gift. The AI models are also developing quickly. My Amendment 241 would require Ofcom to be alert to the ways in which emerging technologies allow for activities that are illegal in the real world to be carried out online, to identify the places where the law is not keeping up to date with technological developments.
The metaverse seems to have 10 attributes. It is multiuser and multipurpose, content is user-generated, it is immersive, and spatial interactions occur in virtual reality or in physical environments enhanced by augmented reality. Its digital aspects do not expire when the experience ends, and it is multiplatform and interoperable, as users move between platforms. Avatars are involved, and in the metaverse there is ownership of the avatars or of other assets such as virtual property, cryptocurrency et cetera. These attributes allow it to be used for training in complex scenarios, such as surgical training for keyhole surgery, where it can improve accuracy rapidly. On the horizon are brain/computer interfaces, which may be very helpful in rehabilitative adaptation after severe neurological damage.
These developments have great potential. However, dangers arise when virtual and augmented reality devices are linked to such things as wearable haptic suits, which allow the user to feel interactions through physical sensation, and teledildonics, which are electronic devices that simulate sexual interaction.
With the development of deep-fake imagery, it is now possible for an individual to order a VR experience of abusing the image of a child whom they know. The computer-generated images are so realistic that they are almost impossible to distinguish from those that would be camera-generated. An avatar can sexually assault the avatar of a minor, and such an avatar of the minor can be personalised. Worryingly, there have been growing reports of these assaults and rapes happening. Since the intention of VR is to trick the human nervous system into experiencing perceptual and bodily reactions, while such a virtual assault may not involve physical touching, the psychological, neurological and emotional experience can be similar to a physical assault.
This fuels sex addiction and violence addiction, and is altering the offender pathway: once the offender has engaged with VR abuse material, there is no desire to go back to 2D material. Offenders report that they want more: in the case of VR, that would be moving to live abuse, as has been said. The time from the development of abnormal sexual desires to real offending is shortened as the offender seeks ever-increasing and diverse stimulation to achieve the same reward. Through Amendment 125, such content would be regarded as user-generated.
Under Amendment 241, Ofcom could suggest ways in which Parliament may want to update the current law on child pornography to catch such deep-fake imagery, as these problematic behaviours are illegal in the real world but do not appear to be illegal online or in the virtual world.
Difficulties also arise over aspects of terrorism. It is currently a criminal offence to attend a terrorist training ground. Can the Minister confirm that Amendment 136C, which we have debated and which will be moved in a later group, would make attending a virtual training ground illegal? How will Ofcom be placed to identify and close any loopholes?
The Dawes Centre for Future Crime has identified 31 unique crime threats or offences which are risks in the metaverse, particularly relating to child sexual abuse material, child grooming, investment scams, hate crime, harassment and radicalisation.
I hope the Minister can confirm that the Bill already applies to the metaverse, with its definition of user-to-user services and technology-neutral terminology, and that its broad definition of “encountering” includes experiencing content such as haptic suits or virtual or augmented reality through the technology-neutral expression “or other automated tool”. Can the Minister also confirm that the changes made in the other place in Clause 85 require providers of metaverse services to consider the level of risk of the service being used for the commission or facilitation of a priority offence?
The welcome addition to the Bill of a risk assessment duty, however, should be broadened to include offences which are not only priority offences. I ask the Minister: will the list of offences in Schedules 5 to 7 to the Bill be amended to include the option of adding to this list to cover other harmful offences such as sexual offences against adults, impersonation scams, and cyber physical attacks such as cyber burglary, which can lead to planned burglary, attacks on key infrastructure and assault?
The ability to expand the risk assessment criteria could future-proof the Bill against such offences by keeping the list open, rather than closed as it is at the moment, to other serious offences committed in user-to-user or combined service providers. Such duties should apply across all services, not only those in category 1, because the smaller platforms, which are not covered by empowerment duties, may present a particularly high risk of illegal content and harmful behaviours.
Can the Minister therefore please tell us how content that is illegal in the real world will be reported, and how complaints can be made when it is encountered, if it is not a listed priority offence in the Bill? Will the Government expand the scope to cover not only illegal content, as defined in Clauses 207 and 53, but complex activities and interactions that are possible in the metaverse? How will the list of priority offences be expanded? Will the Government amend the Bill to enable Ofcom to take a risk-based approach to identifying who becomes classified as a category 1 provider?
I could go on to list many other ways in which our current laws will struggle to remain relevant against the emerging technologies. The list’s length shows the need for Ofcom to be able to act and report on such areas—and that Parliament must be alive to the need to stay up to date.
My Lords, I am grateful to the noble Baroness, Lady Finlay of Llandaff, for tempering her remarks. On tempering speeches and things like that, I can inform noble Lords that the current school group have been escorted from the Chamber, and no further school groups will enter for the duration of the debate on this group of amendments.
My Lords, I apologise to my noble friend. I ask that we pause the debate to ask this school group to exit the Chamber. We do not think that the subject matter and content will be suitable for that audience. I am very sorry. The House is pausing.
In this moment while we pause, I congratulate the noble Lord, the Government Whip, for being so vigilant: some of us in the Chamber cannot see the whole Gallery. It is appreciated.
I, too, thank my noble friend the Government Whip. I apologise too if I have spoken out of discourtesy in the Committee: I was not sure whose name was on which amendment, so I will continue.
Physically, I am, of course, working in my home. If that behaviour had happened in the office, it would be an offence, an assault: “intentional or reckless application of unlawful force to another person”. It will not be an offence in the metaverse and it is probably not harassment because it is not a course of conduct.
Although the basic definition of user-to-user content covers the metaverse, as does encountering, as has been mentioned in relation to content under Clause 207, which is broad enough to cover the haptic suits, the restriction to illegal content could be problematic, as the metaverse is a complex of live interactions that mimics real life and such behaviours, including criminal ones. Also, the avatar of an adult could sexually assault the avatar of a child in the metaverse, and with haptic technologies this would not be just a virtual experience. Potentially even more fundamentally than Amendment 125, the Bill is premised on the internet being a solely virtual environment when it comes to content that can harm. But what I am seeking to outline is that conduct can also harm.
I recognise that we cannot catch everything in this Bill at this moment. This research is literally hot off the press; it is only a few weeks old. At the very least, it highlights the need for future-proofing. I am aware that some of the issues I have highlighted about the fundamental difference between conduct and content refer to clauses noble Lords may already have debated. However, I believe that these points are significant. It is just happenstance that the research came out and is hot off the press. I would be grateful if the Minister would meet the Dawes Centre urgently to consider whether there are further changes the Government need to make to the Bill to ensure that it covers the harms I have outlined.
This is a very interesting discussion; the noble Lord, Lord Knight, has hit on something really important. When somebody does an activity that we believe is criminal, we can interrogate them and ask how they came to do it and how they reached the conclusion that they did. The difficulty is that those of us who are not super-techy do not understand how you can interrogate a bot or an AI which appears to be out of control on how it reached the conclusion that it did. It may be drawing from lots of different places, and there may be ownership of lots of different sources of information. I wonder whether that is why we are finding how this will be monitored in future so concerning. I am reassured that the noble Lord, Lord Knight of Weymouth, is nodding; does the Minister concur that this may be a looming problem for us?
I certainly concur that we should discuss the issue in greater detail. I am very happy to do so with the noble Lord, the noble Baroness and others who want to do so, along with officials. If we can bring some worked examples of what “in control” and “out of control” bots may be, that would be helpful.
I hope the points I have set out in relation to the other issues raised in this group and the amendments before us are satisfactory to noble Lords and that they will at this point be content not to press their amendments.