Online Safety Bill Debate
Lords Chamber

My Lords, I am very grateful to the noble Baroness, Lady Harding, for the way she introduced this group of amendments. I have added my name to Amendment 125 and have tabled probing Amendments 241 and 301 in an attempt to future-proof the Bill. As the noble Baroness has said, this is not the future but today, tomorrow and forever, going forwards.
I hope that there are no children in the Public Gallery, but from my position I cannot see.
There are some children in the Public Gallery.
Then I shall slightly modify some of the things I was going to say.
When this Bill was conceived, the online world was very different from how it is today. It is hard to imagine how it will look in the future. I am very grateful to the noble Baroness, Lady Berridge, and the Dawes Centre for Future Crime at UCL, for information that they have given to me. I am also grateful to my noble friend Lady Kidron, and the enforcement officers who have shared with us images which are so horrific that I wish that I had never seen them—but you cannot unsee what you have seen. I admire how they have kept going and maintained a moral compass in their work.
The metaverse is already disrupting the online world as we know it. By 2024, it is estimated that there will be 1.7 billion mobile augmented-reality user devices worldwide. More than one-fifth of five to 10-year-olds already have a virtual reality headset of their own or have asked for similar technology as a gift. AI models are also developing quickly. My Amendment 241 would require Ofcom to be alert to the ways in which emerging technologies allow activities that are illegal in the real world to be carried out online, and to identify where the law is not keeping pace with technological developments.
The metaverse seems to have 10 attributes. It is multiuser and multipurpose; content is user-generated; it is immersive; and spatial interactions occur in virtual reality or in physical environments enhanced by augmented reality. Its digital aspects do not expire when the experience ends, and it is multiplatform and interoperable, as users move between platforms. Avatars are involved, and in the metaverse there is ownership of the avatars or of other assets such as virtual property, cryptocurrency et cetera. These attributes allow it to be used for mastering complex training scenarios, such as surgical training for keyhole surgery, where it can rapidly improve accuracy. On the horizon are brain-computer interfaces, which may be very helpful in rehabilitative adaptation after severe neurological damage.
These developments have great potential. However, dangers arise when virtual and augmented reality devices are linked to such things as wearable haptic suits, which allow the user to feel interactions through physical sensation, and teledildonics, which are electronic devices that simulate sexual interaction.
With the development of deep-fake imagery, it is now possible for an individual to order a VR experience of abusing the image of a child whom they know. Unlike cartoon-generated material, the computer-generated images are so realistic that they are almost impossible to distinguish from real imagery. An avatar can sexually assault the avatar of a minor, and such an avatar of the minor can be personalised. Worryingly, there have been growing reports of these assaults and rapes happening. Since VR is intended to trick the human nervous system into experiencing perceptual and bodily reactions, such a virtual assault, while it may not involve physical touching, can produce a psychological, neurological and emotional experience similar to that of a physical assault.
This fuels sex addiction and violence addiction, and is altering the offender pathway: once the offender has engaged with VR abuse material, there is no desire to go back to 2D material. Offenders report that they want more: in the case of VR, that would be moving to live abuse, as has been said. The time from the development of abnormal sexual desires to real offending is shortened as the offender seeks ever-increasing and diverse stimulation to achieve the same reward. Through Amendment 125, such content would be regarded as user-generated.
Under Amendment 241, Ofcom could suggest ways in which Parliament may want to update the current law on child pornography to catch such deep-fake imagery, as these problematic behaviours are illegal in the real world but do not appear to be illegal online or in the virtual world.
Difficulties also arise over aspects of terrorism. It is currently a criminal offence to attend a terrorist training ground. Can the Minister confirm that Amendment 136C, which we have debated and which will be moved in a later group, would make attending a virtual training ground illegal? How will Ofcom be placed to identify and close any loopholes?
The Dawes Centre for Future Crime has identified 31 unique crime threats or offences which are risks in the metaverse, particularly relating to child sexual abuse material, child grooming, investment scams, hate crime, harassment and radicalisation.
I hope the Minister can confirm that the Bill already applies to the metaverse, with its definition of user-to-user services and its technology-neutral terminology, and that its broad definition of “encountering” covers content experienced through haptic suits or virtual or augmented reality, by way of the technology-neutral expression “or other automated tool”. Can the Minister also confirm that the changes made in the other place to Clause 85 require providers of metaverse services to consider the level of risk of the service being used for the commission or facilitation of a priority offence?
The welcome addition to the Bill of a risk assessment duty should, however, be broadened to cover offences beyond the priority offences. I ask the Minister: will the list of offences in Schedules 5 to 7 to the Bill be amended to allow additions to it, so as to cover other harmful offences such as sexual offences against adults, impersonation scams, and cyber-physical attacks such as cyber burglary, which can lead to planned burglary, attacks on key infrastructure and assault?
The ability to expand the risk assessment criteria could future-proof the Bill against such offences by keeping the list open, rather than closed as it is at the moment, to other serious offences committed in user-to-user or combined service providers. Such duties should apply across all services, not only those in category 1, because the smaller platforms, which are not covered by empowerment duties, may present a particularly high risk of illegal content and harmful behaviours.
Can the Minister therefore please tell us how content that is illegal in the real world will be reported, and how complaints can be made when it is encountered, if it is not a listed priority offence in the Bill? Will the Government expand the scope to cover not only illegal content, as defined in Clauses 207 and 53, but complex activities and interactions that are possible in the metaverse? How will the list of priority offences be expanded? Will the Government amend the Bill to enable Ofcom to take a risk-based approach to identifying who becomes classified as a category 1 provider?
I could go on to list many other ways in which our current laws will struggle to remain relevant in the face of emerging technologies. The length of that list shows why Ofcom must be able to act and report on such areas, and why Parliament must be alive to the need to stay up to date.