Online Safety Bill Debate
(1 year, 10 months ago)
Lords Chamber

My Lords, I shall attempt to be brief but, based on previous experience with other speakers, that may be difficult. At least it gives the Whip on the Front Bench the chance to do some agile body moves.
I welcome this overdue Bill. I think the Minister got it slightly wrong when he congratulated us on waiting patiently for it. Judging by every single contribution around the entire House today, patience has been rather wanting. We want to get on with it. Like many government Bills, this has grown like Topsy. It has grown sideways, downwards and upwards. We need to beware of going around in circles. Above all, we need to expedite this and get it on the statute book.
I will focus on three key areas. Unsurprisingly, the first will be children. Here I declare that I am a governor of Coram, the oldest children’s charity in the United Kingdom. I will certainly support amendments such as those that the noble Lord, Lord Bethell, was talking about to try to bring in proper age verification.
Like many other noble Lords, on Monday I had the privilege of sitting in on the briefing that the noble Baroness, Lady Kidron, arranged. Ian Russell, the father of Molly Russell, was present, together with one of her sisters. What we saw was truly shocking. In some ways it was particularly shocking to me because, as Ian shared some of his daughter’s diary—what she had actually written in the days and weeks before she died—I had a sudden jolt of recognition. What 14 year-old Molly was saying was almost identical to the transcript of the suicide note that my father wrote to my mother, which I have in my desk at home. It has the same self-loathing, the feeling of worthlessness and the belief—completely wrong—that you would better serve those you love and live with by departing from this life. My father was a Second World War veteran who had won the Military Cross. He was suffering from manic depression and was clearly in a depressed state, but I cannot even begin to imagine the effect it must have had on Molly to be deluged, 24 hours a day, in filthy, negative, awful, harmful content. Perversely, the more she looked at it, the more excited the algorithm got and the more she received.
Particularly disgraceful is that it took no less than five years for the family and their lawyer finally to get some of the platforms Molly had been using to disgorge and show some of the content she had been viewing. Five years is wholly and utterly unacceptable.
I take the point that the noble Baroness, Lady Bennett, made about young people being involved. It would be a good idea for Ofcom in some way, shape or form to have access to young people advising it. I support in principle the idea of a Joint Committee of Parliament. Again, I think it would be very helpful to have young people advising that.
The second area is supporting the wonderful noble Baroness, Lady Kidron. I declare quite openly that I am a Beebanite. I think there are quite a few of us in the House, and we will do everything we can to support the wonderful noble Baroness in everything she does.
Lastly, I come to the companies. I speak as somebody who was a head-hunter for 30 years. A large part of our business was in North America and—surprise, surprise—a lot of our most wonderful clients were some of these new tech giants. Because of that, I know a lot about what I would call the psychology of attraction and repulsion. I can tell the House that for many years, if you went to a candidate and said, “Would you like to join Facebook, or one of these other companies?”, they would get pretty excited, because it is new technology, there is a lot of money, it is sexy, it is probably in California—what could be better?
We have to change the paradigm in which people look at potentially being employed by those companies. We have to create a frisson of fear and forethought that, if they do join forces with those companies, not only might their personal reputation suffer but the reputation of the company will suffer, shareholders will suffer, and those who provide services to that company, be they banks or lawyers, will also suffer. That is what we need to change. I will do everything I can, working with others who probably know rather more about this than I do, to concentrate on getting into the minds of those companies, which have huge resources, legal and financial, to resist whatever we do. We have to get inside their minds, find their weak points and go for the jugular.
Online Safety Bill Debate
(1 year, 8 months ago)
Lords Chamber

My Lords, first, I am relieved to hear that I am not the only thick person in this Committee, because I have struggled to understand and follow the detail and interconnectedness of everything in the Bill. The maxim that you need simplicity and clarity, especially if the Bill is going to be effective, is really important. That is why I think this amendment is a no-brainer: just set it out at the front.
Secondly, the amendment provides a guideline, or a lens through which we read the complexity of what follows. That might even lead us, as we go through some of the detail, to strip stuff out and make it simpler for everybody to understand. It does not have to grow the extent of the Bill. It might help us to be—I think this is the most important word I have heard—disciplined as we proceed. I support the amendment.
My Lords, I suggest, very briefly, that we look at this amendment in a slightly different way. Understandably, we have a tendency in Parliament to look at things through our own lens, and perhaps some of us are viewing this amendment as a reminder of what the Bill is about.
The noble Baroness, Lady Harding, made a very good point about clarity. I suggest we imagine that we are one of the companies that the Bill is designed to try to better manage. Imagine you are in the boardroom, or on the executive management team, and you are either already doing business in the United Kingdom or are considering entering the UK market. You know there is an enormous piece of legislation that is designed to try to bring some order to the area your business is in. At the moment, without this amendment, the Bill is a lawyer’s paradise, because it can be looked at in a multitude of ways. I put it to the Minister and the Bill team that it would be extremely helpful to have something in the Bill that makes it completely clear, to any business thinking of engaging in any online activities in the United Kingdom, what this legislation is about.
My Lords, I am one of those who found the Bill extremely complicated, but I do not find this amendment extremely complicated. It is precise, simple, articulate and to the point, and I think it gives us a good beginning for debating what is an extremely complex Bill.
I support this amendment because I believe, and have done so for a very long time, that social media has done a great deal more harm than good, even though it is capable of doing great good. Whether advertently or inadvertently, the worst of all things it has done is to destroy childhood innocence. We are often reminded in this House that the prime duty of any Government is to protect the realm, and of course it is. But that is a very broad statement. We can protect the realm only if we protect those within it. Our greatest obligation is to protect children—to allow them to grow up, so far as possible, uncorrupted by the wicked ways of a wicked world and with standards and beliefs that they can measure actions against. Complex as it is, the Bill is a good beginning, and its prime purpose must be the protection and safeguarding of childhood innocence.
The noble Lord, Lord Griffiths of Burry Port, spoke a few moments ago about the instructions he was given as a young preacher. I remember when I was training to be a lay reader in the Church of England, 60 or more years ago, being told that if you had been speaking for eight minutes and had not struck oil, stop boring. I think that too is a good maxim.
We have got to try to make the Bill comprehensible to those around the country whom it will affect. The worst thing we do, and I have mentioned this in connection with other Bills, is to produce laws that are unintelligible to the people in the country; that is why I was very sympathetic to the remarks of my noble friend Lord Inglewood. This amendment is a very good beginning. It is clear and precise. I think nearly all of us who have spoken so far would like to see it in the Bill. I see the noble Baroness, Lady Fox, rising—does she wish to intervene?
Online Safety Bill Debate
(1 year, 7 months ago)
Lords Chamber

My Lords, as I listen to the words echoing around the Chamber, I try to put myself in the shoes of parents or children who, in one way or another, have suffered as a result of exposure to things happening online. Essentially, the world that we are talking about has been allowed to grow like Topsy, largely unregulated, at a global level and at a furious pace, and that is still happening as we do this. The horses have not just bolted the stable; they are out of sight and across the ocean. We are talking about controlling and understanding an environment that is moving so quickly that, however fast we move, we will be behind it. Whatever mousetraps we put in place to try to protect children, we know there are going to be loopholes, not least because children individually are probably smarter than we are collectively at knowing how to get around well-meaning safeguards.
There are ways of testing what is happening. Certain organisations have used what they term avatars. Essentially, you create mythical profiles of children, which are clearly stated as being children, and effectively let them loose in the online world in various directions on various platforms and observe what happens. The tests that have been done on this—we will go into this in more detail on Thursday when we talk about safety by design—are pretty eye-watering. The speed with which these avatars, despite being openly identified as profiles of children, are deluged by a variety of content that should be nowhere near children is dramatic, and the technique is incredibly effective.
I put it to the Minister and the Bill team that one of the challenges for Ofcom will be not to be so far behind the curve that it is always trying to catch up. It is like being a surfer: if you are going to keep going then you have to keep on the front side of the wave. The minute you fall behind it, you are never going to catch up. I fear that, however well-intentioned so much of the Bill is, unless and until His Majesty’s Government and Ofcom recognise that we are probably already slightly behind the crest of the wave, whatever we try to do and whatever safeguards we put in place are not necessarily going to work.
One way we can try to make what we do more effective is the clever, forensic use of approaches such as avatars, not least because I suspect their efficacy will be dramatically increased by the advent and use of AI.
Tim Cook, the CEO of Apple, put it very well:
“Kids are born digital, they’re digital kids now … And it is, I think, really important to set some hard rails around it”.
The truth is that in the area of app stores, Google and Apple, which, as we have heard, have a more than 95% share of the market, are just not voluntarily upholding their responsibilities in making the UK a safe place for children online. There is an air of exceptionalism about the way they behave that suggests they think the digital world is somehow different from the real world. I do not accept that, which is why I support the amendments in the name of my noble friend Lady Harding and others—Amendments 19, 22, 298, 299 and other connected amendments.
There are major holes in the app stores’ child safety measures, which mean that young teens can access adult apps that offer dating, random chats, casual sex and gambling, even when Apple and Google emphatically know that the user is a minor. I will give an example. Using an Apple ID for a simulated 14 year-old, the Tech Transparency Project looked at 80 apps in the App Store that are theoretically limited to those aged 17 and older. It found that underage users could very easily evade age restrictions in the vast majority of cases: a dating app that opens directly into pornography before ever asking the user’s age; adult chat apps filled with explicit images that never ask the user’s age; and a gambling app that lets the minor account deposit and withdraw money.
What kind of apps are we talking about here? We are talking about apps such as UberHoney; Eros, the hook-up and adult chat app; Hahanono—Chat & Get Naughty; and Cash Clash Games: Win Money. The investigation found that Apple and the apps essentially pass the buck to each other when it comes to blocking underage users, making it easy for young teens to slip through the system. My day-to-day experience as a parent of four children completely echoes that investigation, and it is clear to me that Apple and Google just do not share age data with the apps in their app stores, or else children would not be able to download those apps.
There is a wilful blindness to minors tweaking their age. Parental controls on mobile phones are, to put it politely, a joke. It takes a child a matter of minutes to circumvent them—I know from my experience—and I have wasted many hours fruitlessly trying to control these arrangements. That is just not good enough for any business. It is not good enough because so many teenagers have mobile phones, as we discussed—two-thirds of children have a smartphone by the age of 10. Moreover, it is not good enough because they are accessing huge amounts of filthy content, dodgy services and predatory adults, things that would never be allowed in the real world. The Office of the Children’s Commissioner for England revealed that one in 10 children had viewed pornography by the time they were nine years old. The impact on their lives is profound: just read the testimony on recent Mumsnet forums about the awful impact of pornography on children’s lives.
To prevent minors from accessing adult-only apps, the most efficient measure would be, as my noble friend Lady Harding pointed out, to check users’ ages during the distribution step, which means directly in the app store or on the web browser, prior to the app store or the internet browser initiating the app or the platform download. This can be done without the developer knowing the user’s specific age. Developing a reliable age-verification regime applied at that “distribution layer” of the internet supply chain would significantly advance the UK’s objective of creating a safer online experience and set a precedent that Governments around the world could follow. It would apply real-world principles to the internet.
This would not absolve any developer, app or platform of their responsibilities under existing legislation—not at all; it would build on them. It would simply mandate that every player in the ecosystem, right from the app store distribution layer, was legally obliged to promote a safer experience online. That is completely consistent with the principles and aims of the Online Safety Bill.
These amendments would subject two of the biggest tech corporations to the same duties regarding their app stores as we apply to the wider digital ecosystem and the real world. It is all about age assurance and protecting children. To the noble Lord, Lord Allan, I say that I cannot understand why my corner shop requires proof of age to buy cigarettes, pornography or booze, but Apple and Google think it is okay to sell apps with inappropriate content and services without proper age-verification measures and with systems that are wilfully unreliable.
There is a tremendous amount that is very good about Tim Cook’s commitment to privacy and his objections to the data industrial complex; but in this matter of the app stores, the big tech companies have had a blind spot to child safety for decades and a feeling of exceptionalism that is just no longer relevant. These amendments are an important step in requiring that app store owners step up to their responsibilities and that we apply the same standards to shopkeepers in the digital world as we would to shopkeepers in the real world.
Online Safety Bill Debate
(1 year, 7 months ago)
Lords Chamber

I completely understand that; I was making the point that there will be disagreements in judgments. In that instance, it was resolved by a court, but we are talking about a situation where I am not sure how the judgment is made.
In these amendments, there are lists of particular harms—a variety are named, including self-harm—and I wanted to provide some counterexamples of what I consider to be harms. I have been inundated by algorithmic adverts for “Naked Education” on Channel 4, maybe because of the algorithms I am on. I think that the programme is irresponsible; I say that having watched it, rather than just having read a headline. Channel 4 presents this programme, which features naked adults and children, as educational, saying that it is introducing children to the naked body. I think it is harmful for children and that it should not be on the television, but it is advertised on social media—I have seen quite a lot of it.
The greatest example of self-harm we encounter at present is when gender dysphoric teenagers—as well as some younger than teenagers; they are predominantly young women—are affirmed by adults, as a kind of social contagion, into taking body-changing and body-damaging hormones and undergoing self-mutilation, whether by breast binding or double mastectomies, which is advertised and praised by adults. That is incredibly harmful for young people, and it is reflected online a lot, because much of this is discussed, advertised or promoted online.
This is related to the earlier contributions, because I am asking: should those be added to the list of obvious harms? Although not many noble Lords are in the House now, if there were many more here, they would object to what I am saying by stating, “That is not harmful at all. What is harmful is what you’re saying, Baroness Fox, because you’re causing psychological harm to all those young people by being transphobic”. I am raising these matters because we think we all agree that there is a consensus on what is harmful material online for young people, but it is not that straightforward.
The amendment states that the Bill should target any platform that posts
“links to, or … encourages child users to seek”
out “dangerous or illegal activity”. I understand “illegal activity”, but on “dangerous” activities, I assume that we do not mean extreme sports, mountain climbing and so on, which are dangerous—that comes to mind probably because I have spent too much time with young people who spend their whole time looking at those things. I worry about the unintended consequences of things being banned or misinterpreted in that way.
To respond briefly to the noble Baroness, I shall give a specific example of how Amendment 93 would help. Let us go back to the coroner’s courtroom where the parents of Molly Russell were trying to get the coroner to understand what had happened to their daughter. The legal team from Meta was there, with combined salaries probably in seven figures, and the argument was about the detail of the content. At one point, I recall Ian Russell saying that one of the Meta lawyers said, “We are topic agnostic”. I put it to the noble Baroness that, had the provisions in Amendment 93 been in place, first, under “Content harms” in proposed new paragraph 3(c) and (d), Meta would have been at fault; under “Contact harms” in proposed new paragraph 4(b), Meta would have been at fault; under “Conduct harms” in proposed new paragraph 5(b), Meta would have been at fault; and under “Commercial harms” in proposed new paragraph 6(a) and (b), Meta would have been at fault. That would have made things a great deal simpler.
Online Safety Bill Debate
(1 year, 7 months ago)
Lords Chamber

My Lords, this large group of 33 amendments is concerned with preventing harm to children, by creating a legal requirement to design the sites and services that children will access in a way that will put their safety first and foremost. I thank my co-sponsors, the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lord, Lord Knight. First of all, I wish to do the most important thing I will do today: to wish the noble Baroness, Lady Kidron, a very happy birthday.
My co-sponsors will deal with some of the more detailed elements of the amendments before us, including safety duties, functionality and harm, and codes of practice. I am sure that the noble Lords, Lord Stevenson and Lord Knight, and the right reverend Prelate the Bishop of Oxford will speak to their own amendments.
I will provide a brief overview of why we are so convinced of the paramount need for a safety by design approach to protect children and remind digital companies and platforms, forcibly and legally, of their obligation to include the interests and safety of children as a paramount element within their business strategies and operating models. These sites and services are artificial environments. They were designed artificially and can be redesigned artificially.
In her testimony to the US Senate in October 2021, the Facebook whistleblower Frances Haugen put her finger on it rather uncomfortably when talking about her erstwhile employer:
“Facebook know that they are leading young users to anorexia content … Facebook’s internal research is aware that there are a variety of problems facing children on Instagram … they know that severe harm is happening to children”.
She was talking about research from, probably, three years ago.
On the first day of Committee, the noble Lord, Lord Allan, who is not with us today, used the analogy of the legally mandated and regulated safe design of aeroplanes and automobiles and the different regimes that cover their usage to illustrate some of our choices in dealing with regulation. We know why aeroplanes and cars have to be designed safely; we also know that either form of transportation could be used recklessly and dangerously, which is why we do not allow children to fly or drive them.
First, let us listen to the designers of these platforms and services through some research done by the 5Rights Foundation in July 2021. These are three direct quotes from the designers:
“Companies make their money from attention. Reducing attention will reduce revenue. If you are a designer working in an attention business, you will design for attention … Senior stakeholders like simple KPIs. Not complex arguments about user needs and human values … If a senior person gives a directive, say increase reach, then that’s what designers design for without necessarily thinking about the consequences”.
Companies know exactly what they need to do to grow and to drive profitability. However, they mostly choose not to consider or mitigate some of the potentially harmful consequences, or to prioritise avoiding them. What they design and prioritise are strategies to maximise consumption, activity and profitability. They are very good at it.
Let us hear what the children say, remembering that some recent research indicates that 42% of five to 12 year-olds in this country use social media. The Pathways research project I referred to earlier worked closely with 21 children aged 12 to 18, who said: “We spend more time online than we feel we should, but it’s tough to stop or cut down”. “If we’re not on social media, we feel excluded”. “We like and value the affirmations and validations we receive”. “We create lots of visual content, much of it about ourselves, and we share it widely”. “Many of us are contacted by unknown adults”. “Many of us recognise that, through using social media, we have experienced body image and relationships problems”.
To test whether the children in this research project were accurately reporting their experiences, the project placed a series of child avatars—ghost children, in effect—on the internet, whose profiles very clearly stated that they were children.
It found—in many cases within a matter of hours of the profiles going online—proactive contact from strangers and rapid recommendations to engage more and more. If searches were conducted for eating disorders or self-harm, the avatars were quickly able to access content irrespective of their stated ages and clearly evident status as children. At the same time as they were being sent harmful or inappropriate content, they also received age-relevant advertising for school revision and for toys—the social media companies knew that these accounts were registered as children.
This research was done two years ago. Has anything improved since then? It just so happens that 5Rights has produced another piece of research, about to be released, which used the exact same technique—creating avatars to see what they would experience online. It used 10 avatars based on real children aged between 10 and 16. So what happened? For an 11 year-old avatar, Instagram was recommending images of knives with the caption “This is what I use to self-harm”; design features were leading children from innocent searches to harmful content very quickly.
I think any grandparents in the Chamber will be aware of an interesting substance known as “Slime”—a form of particularly tactile playdough which one’s grandchildren seem to enjoy. Typing in “Slime” on Reddit was one search, and one click, away from pornography; exactly the same thing happened on Reddit when the avatar typed in “Minecraft”, another very popular game with our children and grandchildren. A 15 year-old female avatar was private-messaged on Instagram by a user that she did not follow—an unknown adult who encouraged her to link to pornographic content on Telegram, another instant messaging service. On the basis of this evidence, it appears that little or nothing has changed; it may even have got slightly worse.
By an uncomfortable coincidence, last week Meta, the parent company of Facebook and Instagram, published better-than-expected results and saw its market value increase by more than $50 billion in after-hours trading. Mark Zuckerberg, the founder of Meta, proudly announced that Meta is pouring investment into artificial intelligence tools to make its platform more engaging and its advertising more effective. Of particular interest and concern, given the evidence of the avatars, was his announcement that since the introduction of Reels, a short-form video feed designed specifically to respond to competition from TikTok, its AI-driven recommendations had boosted the average time people spend on Instagram by 24%.
To return to the analogy of planes and cars used by the noble Lord, Lord Allan, we are dealing here with planes and cars in the shape of platforms and applications which we know are flawed in their design. They are not adequately designed for safety, and we know that they can put users, particularly children and young people, in the way of great harm, as many grieving families can testify.
In conclusion, our amendments propose that companies must design digital services that cater for the vulnerabilities, needs, and rights of children and young people by default; children’s safety cannot and must not be an afterthought or a casualty of their business models. We are asking for safety by design to protect children to become the mandatory standard. What we have today is unsafe design by default, driven by commercial strategies which can lead to children becoming collateral damage.
Given that it is the noble Baroness’s birthday, I am sure we can feel confident that the Minister will have a positive tone when he replies. I beg to move.
My Lords, I was not going to speak on this group, but I was provoked into offering some reflections on the speech by the noble Lord, Lord Russell of Liverpool, especially his opening remarks about cars and planes, which he said were designed to be safe. He did not mention trains, about which I know something as well, and which are also designed to be safe. Let me offer a few initial reflective points. These modes of transport are designed in very different ways. An aeroplane is designed never to fail; a train is designed so that if it fails, it will come to a stop. They are two totally different approaches to safety. Simply saying that something must be designed to be safe does not answer questions; it opens questions about what we actually mean by that. The noble Lord went on to say that we do not allow children to drive cars and fly planes. That is absolutely true, but the thrust of his amendment is that we should design the internet so that it can be driven and used by children—so that it is designed for them, not for adults. That is my problem with the general thrust of many of these amendments.
A further reflection that came to mind as the noble Lord spoke was on a book of great interest that I recommend to noble Lords: Risk, written in 1995 by Professor John Adams, then professor of geography at University College London and still an emeritus professor there. It was a most interesting work on risk. First, it reflected on how little we actually know of many of the things whose risk we are trying to assess.
More importantly, he went on to say that people have an appetite for risk. That appetite for risk—that risk budget, so to speak—changes over the course of one’s life: one has much less appetite for risk when one gets to a certain age than perhaps one had when one was young. I have never bungee jumped in my life, and I think I can assure noble Lords that the time has come when I can say I never shall, but there might have been a time when I was younger when I might have flung myself off a cliff, attached to a rubber band and so forth—noble Lords may have done so. One has an appetite for risk.
The interesting thing that he went on to develop from that was the notion of risk compensation: that if you have an appetite for risk and your opportunities to take risks are taken away, all you do is compensate by taking risks elsewhere. So a country such as New Zealand, which has some of the strictest cycling safety laws, also has a very high incidence of bungee jumping among the young; as they cannot take risks on their bicycles, they will find ways to go and do it elsewhere.
Although these reflections are not directly germane to the amendments, they are important as we try to understand what we are seeking to achieve here, which is a sort of hermetically sealed absence of risk for children. I do not think it will work. I said at Second Reading that I thought the flavour of the debate was somewhat similar to a late medieval conclave of clerics trying to work out how to mitigate the harmful effects of the invention of movable type. That did not work either, and I think we are in a very similar position today as we discuss this.
There is also the question of harm and what it means. While the examples being given by noble Lords are very specific and no doubt genuinely harmful, and are the sorts of things that we should like to stop, the drafting of the amendments, using very vague words such as “harm”, is dangerous overreach in the Bill. To give just one example, for the sake of speed, when I was young, administering the cane periodically was thought good for a child in certain circumstances. The mantra was, “Spare the rod and spoil the child”, though I never heard it said. Nowadays, we would not think it morally or psychologically good to do physical harm to a child. We would regard it as an unmitigated harm and, although not necessarily banned or illegal, it is something that—
My Lords, I respond to the noble Lord in two ways. First, I ask him to reflect on how the parents of the children who have died through what the parents would undoubtedly view as serious and unbearable harm would feel about his philosophical ruminations. Secondly, as somebody who has the privilege of being a Deputy Speaker in your Lordships’ House, it is incumbent and germane for us all to focus on the amendment in question and stay on it, to save time and get through the business.
Well, I must regard myself as doubly rebuked, and unfairly, because my reflections are very relevant to the amendments, and I have developed them in that direction. In respect of the parents, they have suffered very cruelly and wrongly, but although it may sound harsh, as I have said in this House before on other matters, hard cases make bad law. We are in the business of trying to make good law that applies to the whole population, so I do not think that these are wholly—
My Lords, I thank the Minister for his response. I think the entire Chamber will be thankful that I do not intend to respond in any great detail to almost one hour and three-quarters of debate on this series of amendments—I will just make a few points and suggestions.
The point that the noble Baroness made at the beginning about understanding the design and architecture of the systems and processes is fundamental, both for understanding why they are causing the sorts of harm that they are at the moment and for trying to ensure that they are designed better in future than they have been to date. Clearly, they are seriously remiss in the harms that they are inflicting on a generation of young people.
On the point made by the noble Baroness, Lady Harding, about trying to make Ofcom’s job easier— I can see the noble Lord, Lord Grade, in the corner— I would hope and anticipate that anything we could suggest that would lead the Government to make Ofcom’s job slightly easier and clearer would be very welcome. The noble Lord appears to be making an affirmatory gesture, so I will take that as a yes.
I say to the noble Lord, Lord Moylan, that I fully understand the importance of waving the flag of liberty and free speech, and I acknowledge its importance. I also acknowledge the always-incipient danger of unintentionally preventing things from happening that can and should happen when you are trying to make things safer and prevent harm. Trying to get the right balance is extraordinarily difficult, but I applaud the noble Lord for standing up and saying what he said. If one were to judge the balance of the contributions here as a very rough opinion poll, the noble Lord might find himself in the minority, but that does not necessarily mean that he is wrong, so I would encourage him to keep contributing.
I sympathise with the noble Baroness, Lady Fox, in trying to find the right balance; it is something that we are all struggling to do. One of the great privileges we have in this House is that we have the time to do it in a manner which is actively discouraged in the other place. Even if we go on a bit, we are talking about matters which are very important—in particular, the pre-legislative scrutiny committee was able to cover them in greater detail than the House of Commons was able to do.
The noble Lord, Lord Clement-Jones, was right. In the same way as they say, “Follow the money”, in this case it is “follow the algorithms”, because it is the algorithms which drive the business model.
On the points made by the noble Lord, Lord Knight, regarding the New York Times article about Geoffrey Hinton, one of the architects of AI at Google, I would recommend that all your Lordships read it to see somebody who has been at the forefront of developing artificial intelligence. Rather like a character in a Jules Verne novel suddenly slightly aghast at what he has created—Frankenstein comes to mind—it makes one pause for thought. Even as we are talking about these things, AI is racing ahead like a greyhound in pursuit of a very fast rabbit, and there is no way that we will be able to catch up.
While I thank the Minister for his reply, there is a real point here about how complex this Bill is to follow and understand. When we debated some of the amendments last week, the noble Baroness, Lady Harding, spoke about the train journey on which she tried to interrogate and interpret the different parts of the Bill, following the trail to understand what was going on; she became so involved that she missed her station. Indeed, the way in which the Minister had to point to all the different points of the compass, so to speak, both within the Bill and outside it, in many of the answers that he gave to some of the amendments indicates to me that the Bill team is finding it challenging to respond to them. It is like filling in one of those diagrams where you join the dots: you cannot quite see what it is until you have nearly finished. I find it slightly disturbing if the Bill team and some of the officials appear to be having a challenging time trying to interpret, understand and explain some of the points we are raising; I would hope and expect that that could be done much more simply.
One of the pleas from all of us, across a whole variety of these amendments, is to get the balance right between legislating for what we want to legislate and making the result simple enough to be understandable. At the moment, a criticism of this Bill is that it is extraordinarily difficult to understand in many parts. I will not go through all the points, but there are some germane areas where it would be extremely helpful to pursue with the Minister and the Bill team some of the points we are trying to make. Many of them are raised by a variety of outside bodies which know infinitely more about this than I do, and which have genuine concerns. We have the time between Committee and Report to put some of those to bed, or at least to understand them better than we do at the moment. We will probably be happy and satisfied with some of the responses that we receive from the department once we feel that we understand them and, perhaps more importantly, once we feel that the department and the Bill team themselves fully understand them. It is fair to say that at the moment we are not completely comfortable that they do. I do not blame the Minister for that. If I were in his shoes, I would be on a very long holiday and would not be returning any time soon. However, we will request meetings—it would be too much for one meeting, so we will try to break this into bite-sized units and then dig into the detail in a manageable way, without taking too much time, to make sure that we understand each other.
With that, I beg leave to withdraw the amendment.
My Lords, I am sorry that it is me again—a bit like a worn 78. In moving Amendment 25, I will speak also to Amendments 78, 187 and 196, all of which speak to the principle of children’s rights as set out in the UN Convention on the Rights of the Child and, more specifically, how those rights are applied to the digital world as covered in the United Nations’ general comment No. 25, which was produced in 2021 and ratified by the UK Government. What we are suggesting and asking for is that the principles in this general comment are reflected in the Bill. I thank the noble Baronesses, Lady Harding, Lady Kennedy and Lady Bennett, and the noble Lord, Lord Alton—who is not with us—for adding their names to these amendments and for their support.
The general comment No. 25 that I mentioned recognises that children’s rights are applicable in the digital world as well as the real world. These amendments try to establish in the Bill the rights of children. Believe it or not, in this rather lengthy Bill there is not a single reference—as far as we can discern—specifically to children’s rights. There are a lot of other words, but that specific phrase is not used, amazingly enough. These amendments are an attempt to get children’s rights specifically into the Bill. Amendments 30 and 105 in the names of the noble Lords, Lord Clement-Jones and Lord Knight, also seek to preserve the well-being of children. Our aims are very similar, but we will try to argue that the convention would achieve them in a particularly effective and concise way.
The online world is not optional for children, given what we know—not least from some of the detailed and harrowing experiences related by various of your Lordships in the course of the Bill. That fact may be worrying to some adults. We have all heard about parents, grandparents and others who have direct experience of their loved ones coming to harm. By contrast, it is also fascinating to note how many senior executives, and indeed founders, of digital companies forbid their own children from possessing and using mobile phones, typically until they are 12 or 14. That is telling us something. If they themselves do not allow their children access to some of the online world we are talking about so much, that should give us pause for reflection.
Despite the many harms online, there is undoubted good from which all children can benefit, including in their cognitive and skills development, social development and relationships. There are some brilliant things which come from being online. Having age-appropriate experiences online is, moreover, part of children’s fundamental rights. That, essentially, is what these amendments are about.
Throughout the many years that the Bill has been in gestation, we have heard a lot about freedom of speech and how it must be preserved. Indeed, in contrast to children’s rights not being mentioned once in the Bill, “freedom of expression” appears no less than 49 times. I venture to suggest to your Lordships that there is a degree of imbalance there which should cause us to pause and reflect on whether we have that balance quite right.
I will not go into detail, but the UNCRC is the most widely ratified human rights treaty in history, and it is legally binding on the states which are party to it. The UK is a signatory to this convention, yet if we do not get this right in the Bill, we are in danger of falling behind some of our global counterparts. Although I recognise that saying the name of this organisation may bring some members of the governing party out in a rather painful rash, the EU is incorporating the UNCRC into its forthcoming AI Act. Sweden has already incorporated it into law at a different level, and Canada, New Zealand and South Africa are all doing the same. It is not anything to be worried about. Even Wales incorporated it into its domestic law in 2004, and Scotland did so in 2021. This appears to be something that the English have a particular problem with.
My Lords, I will start in the optimistic spirit of the debate we have just had. There are many benefits to young people from the internet: social, educational and many other ways that noble Lords have mentioned today. That is why the Government’s top priority for this legislation has always been to protect children and to ensure that they can enjoy those benefits by going online safely.
Once again, I find myself sympathetic to these amendments, but in a position of seeking to reassure your Lordships that the Bill already delivers on their objectives. Amendments 25, 78, 187 and 196 seek to add references to the United Nations Convention on the Rights of the Child and general comment 25 on children’s rights in relation to the digital environment to the duties on providers and Ofcom in the Bill.
As I have said many times before, children’s rights are at the heart of this legislation, even if the phrase itself is not mentioned in terms. The Bill already reflects the principles of the UN convention and the general comment. Clause 207, for instance, is clear that a “child” means a person under the age of 18, which is in line with the convention. All providers in scope of the Bill need to take robust steps to protect users, including children, from illegal content or activity on their services and to protect children from content which is harmful to them. They will need to ensure that children have a safe, age-appropriate experience on services designed for them.
Both Ofcom and service providers will also have duties in relation to users’ rights to freedom of expression and privacy. The safety objectives will require Ofcom to ensure that services protect children to a higher standard than adults, while also making sure that these services account for the different needs of children at different ages, among other things. Ofcom must also consult bodies with expertise in equality and human rights, including those representing the interests of children, for instance the Children’s Commissioner. While the Government fully support the UN convention and its continued implementation in the UK, it would not be appropriate to place obligations on regulated services to uphold an international treaty between state parties. We agree with the reservations that were expressed by the noble Lord, Lord Weir of Ballyholme, in his speech, and by his noble friend Lady Foster.
The convention’s implementation is a matter for the Government, not for private businesses or voluntary organisations. Similarly, the general comment acts as guidance for state parties and it would not be appropriate to refer to that in relation to private entities. The general comment is not binding and it is for individual states to determine how to implement the convention. I hope that the noble Lord, Lord Russell, will feel reassured that children’s rights are baked into the Bill in more ways than a first glance may suggest, and that he will be content to withdraw his amendment.
The noble Lord, Lord Clement-Jones, in his Amendments 30 and 105, seeks to require platforms and Ofcom to consider a service’s benefits to children’s rights and well-being when considering what is proportionate to fulfil the child safety duties of the Bill. They also add children’s rights and well-being to the online safety objectives for user-to-user services. The Bill as drafted is focused on reducing the risk of harm to children precisely so that they can better enjoy the many benefits of being online. It already requires companies to take a risk-based and proportionate approach to delivering the child safety duties. Providers will need to address only content that poses a risk of harm to children, not that which is beneficial or neutral. The Bill does not require providers to exclude children or restrict access to content or services that may be beneficial for them.
Children’s rights and well-being are already a central feature of the existing safety objectives for user-to-user services in Schedule 4 to the Bill. These require Ofcom to ensure that services protect children to a higher standard than adults, while making sure that these services account for the different needs of children at different ages, among other things. On this basis, while I am sympathetic to the aims of the amendments the noble Lord has brought forward, I respectfully say that I do not think they are needed.
More pertinently, Amendment 30 could have unintended consequences. By introducing a broad balancing exercise between the harms and benefits that children may experience online, it would make it more difficult for Ofcom to follow up instances of non-compliance. For example, service providers could take less effective safety measures to protect children, arguing that, as their service is broadly beneficial to children’s well-being or rights, the extent to which they need to protect children from harm is reduced. This could mean that children are more exposed to more harmful content, which would reduce the benefits of going online. I hope that this reassures the noble Lord, Lord Russell, of the work the Bill does in the areas he has highlighted, and that it explains why I cannot accept his amendments. I invite him to withdraw Amendment 25.
My Lords, I thank all noble Lords for taking part in this discussion. I thank the noble Lord, Lord Weir, although I would say to him that his third point—that, in his experience, the UNCRC is open to different interpretations by different departments—is my experience of normal government. Name me something that has not been interpreted differently by different departments, as it suits them.
I entirely take that point. I was making the slightly wider point—not specifically with regard to the UNCRC—that, whenever legislative provision has been made that a particular department has to have due regard to something, while there is case law, “due regard” has tended to be treated very differently by different departments. So, if even departments within the same Government treat that differently, how much more differently would private companies treat it?
I would simply make the point that it would probably be more accurate to say that the departments treat it with “due disregard”.
This has been a wide ranging debate and I am not going to go through all the different bits and pieces. I recommend that noble Lords read United Nations general comment 25 as it goes, in great detail, right to the heart of the issues we are talking about. For example —this is very pertinent to the next group of amendments—it explicitly protects children from pornography, so I absolutely recommend that it be mentioned in the next group of amendments.
As I expected, the Minister said, “We are very sympathetic but this is not really necessary”. He said that children’s rights are effectively baked into the Bill already. But what is baked into something that children—for whom this is particularly relevant—or even adults might decide to consume is not always immediately obvious. There are problems with an approach whereby one says, “It’s fine because, if you really understood this rather complicated legislation, it would become completely clear to you what it means”. That is a very accurate and compelling demonstration of exactly why some of us have concerns about this well-intentioned Bill. We fear that it will become a sort of feast, enabling company lawyers and regulators to engage in occasionally rather arcane discourse at great expense, demonstrating that what the Government claim is clearly baked in is not so clearly baked in.
Online Safety Bill Debate
(1 year, 7 months ago)
Lords Chamber

My Lords, I indicate my support in principle for what these amendments are trying to achieve.
I speak with a background that goes back nearly 40 years, being involved in health education initiatives, particularly in primary schools. For 24 years—not very good corporate governance—I was the chair of what is now the largest supplier of health education into primary schools in the United Kingdom, reaching about 500,000 children every year.
The principle of preventive health is not a million miles away from what we are talking about today. I take the point that was well made by the noble Baroness, Lady Fox, that piling more and more duties on Ofcom in a well-intentioned way may not have the effect that we want. What we are really looking for and talking about is a joined-up strategy—a challenge for any Government—between the Department for Education, the Department for Digital, Culture, Media and Sport, the Department for Science, Innovation and Technology, and probably the Department of Health and Social Care, because health education, as it has developed over the last 40 or 50 years, has a lot to teach us about how we think about creating effective preventive education.
Online Safety Bill Debate
(1 year, 7 months ago)
Lords Chamber

My Lords, I also put my name to Amendments 250A and 250B, but the noble Baronesses, Lady Newlove and Lady Kidron, have done such a good job that I shall be very brief.
To understand the position that I suspect the Government may put forward, I suggest one looks at Commons Hansard and the discussion of this in the seventh Committee sitting of 9 June last year. To read it is to descend into an Alice in Wonderland vortex of government contradictions. The then Digital Minister—a certain Chris Philp, who, having been so effective as Digital Minister, was promoted, poor fellow, to become a Minister in the Home Office; I do not know what he did to deserve that—in essence said that, on the one hand, this problem is absolutely vast, and we all recognise that. When responding to the various Members of the Committee who put forward the case for an independent appeals mechanism, he said that the reason we cannot have one is that the problem is too big. So we recognise that the problem is very big, but we cannot actually do anything about it, because it is too big.
I got really worried because he later did something that I would advise no Minister in the current Government ever to do in public. Basically, he said that this
“groundbreaking and world-leading legislation”—[Official Report, Commons, Online Safety Bill Committee, 9/6/22; col. 296.]
will fix this. If I ruled the world, if any Minister in the current Government said anything like that, they would immediately lose the Whip. The track record of people standing up and proudly boasting how wonderful everything is going to be, compared to the evidence of what actually happens, is not a track record of which to be particularly proud.
I witnessed, as I am sure others did, the experience of the noble Baroness, Lady Kidron, pulling together a group of bereaved parents: families who had lost their child through events brought about by the online world. A point that has stayed with me from that discussion was the noble Baroness, Lady Kidron, who was not complaining, saying at the end that there is something desperately wrong with a system in which she ends up as the point person trying to help these people resolve their enormous difficulties with these huge companies. I remind noble Lords that the family of Molly Russell, aided by a very effective lawyer, took no less than five years to get Meta actually to disclose what she had been looking at online. So the most effective complaints process, or ombudsman, was the fact that they had a very able lawyer and an exceptionally able advocate in the shape of the noble Baroness, Lady Kidron, helping in any way she could. That is completely inadequate.
I looked at one of the platforms that currently helps individual users—parents—trying to resolve some of the complaints they have with companies. It is incredibly complicated. So relying on the platforms themselves to bring forward, under the terms of the Bill, completely detailed systems and processes to ensure that these things do not happen, or that if there is a complaint it will be followed up dutifully and quickly, does not exactly fill me with confidence, based on their previous form.
For example, as a parent or an individual, here are some of the questions you might have to ask yourself. How do I report violence or domestic abuse online? How do I deal with eating disorder content on social media? How do I know what is graphic content that does not breach terms? How do I deal with abuse online? What do I do as a UK citizen if I live outside the UK? It is a hideously complex world out there. Bringing in regulations to ensure that the platforms do what they are meant to, on the one hand, and charging Ofcom to act as the policeman to make sure that they are actually doing it, on the other, heaps yet more responsibility on Ofcom. The noble Lord, Lord Grade, is showing enormous stamina sitting up in the corner; he is sitting where the noble Lord, Lord Young, usually sits, which is a good way of giving the Committee a good impression.
What I would argue to the Minister is that charging Ofcom with doing too much leads us into dangerous territory. The benefit of having a proper ombudsman who deals with these sorts of complaints week in, week out, is exactly the same argument as for a hip or knee replacement: would you rather have it done by a surgical team that does the operation once a year or one that does it several hundred times a year? I do not know about noble Lords, but I would prefer the latter. If we had an effective ombudsman service that dealt with these platforms day in, day out, its staff would be the people best placed to identify whether those companies were actually doing what the law requires of them, because they would see at first hand how the companies responded. They could then liaise with Ofcom in real time to tell it if particular platforms were not performing as they should. I feel that that would be more effective.
The only question I have for the Minister is whether he would please agree to meet with us between now and Report to really go into this in more detail, because this is an open goal which the Government really should be doing something to try to block. It is a bit of a no-brainer.
My Lords, the amendments in this group are concerned with complaints mechanisms. I turn first to Amendment 56 from the noble Lord, Lord Stevenson of Balmacara, which proposes introducing a requirement on Ofcom to produce an annual review of the effectiveness and efficiency of platforms’ complaints procedures. Were this review to find that regulated services were not complying effectively with their complaints procedure duties, the proposed new clause would provide for Ofcom to establish an ombudsman to provide a dispute resolution service in relation to complaints.
While I am of course sympathetic to the aims of this amendment, the Government remain confident that service providers are best placed to respond to individual user complaints, as they will be able to take appropriate action promptly. This could include removing content, sanctioning offending users, reversing wrongful content removal or changing their systems and processes. Accordingly, the Bill imposes a duty on regulated user-to-user and search services to establish and operate an easy-to-use, accessible and transparent complaints procedure. The complaints procedure must provide for appropriate action to be taken by the provider in relation to the complaint.
It is worth reminding ourselves that this duty is an enforceable requirement. Where a provider is failing to comply with its complaints procedure duties, Ofcom will be able to take enforcement action against the regulated service. Ofcom has a range of enforcement powers, including the power to impose significant penalties and confirmation decisions that can require the provider to take such steps as are required for compliance. In addition, the Bill includes strong super-complaints provisions that will allow for concerns about systemic issues to be raised with the regulator, which will be required to publish its response to the complaint. This process will help to ensure that Ofcom is made aware of issues that users are facing.
Separately, individuals will also be able to submit complaints to Ofcom. Given the likelihood of an overwhelming volume of complaints, as we have heard, Ofcom will not be able to investigate or arbitrate on individual cases. However, those complaints will be an essential part of Ofcom’s horizon-scanning, research, supervision and enforcement activity. They will guide Ofcom in deciding where to focus its attention. Ofcom will also have a statutory duty to conduct consumer research about users’ experiences in relation to regulated services and the handling of complaints made by users to providers of those services. Further, Ofcom can require that category 1, 2A and 2B providers set out in their annual transparency reports the measures taken to comply with their duties in relation to complaints. This will further ensure that Ofcom is aware of any issues facing users in relation to complaints processes.
At the same time, I share the desire expressed to ensure that the complaints mechanisms will be reviewed and assessed. That is why the Bill contains provisions for the Secretary of State to undertake a review of the efficacy of the entire regulatory framework. This will take place between two and five years after the Part 3 provisions come into force, which is a more appropriate interval for the efficacy of the duties around complaints procedures to be reviewed, as it will allow time for the regime to bed in and provide a sufficient evidence base to assess whether changes are needed.
Finally, I note that Amendment 56 assumes that the preferred solution following a review will be an ombudsman. There is probably not enough evidence to suggest that an ombudsman service would be effective for the online safety regime. It is unclear how an ombudsman service would function in support of the new online safety regime, because individual user complaints are likely to be complex and time-sensitive—and indeed, in many cases financial compensation would not be appropriate. So I fear that the noble Lord’s proposed new clause pre-empts the findings of a review with a solution that is resource-intensive and may be unsuitable for this sector.
Amendments 250A and 250B, tabled by my noble friend Lady Newlove, require that an independent appeals system is established and that Ofcom produces guidance to support this system. As I have set out, the Government believe that decisions on user redress and complaints are best dealt with by services. Regulated services will be required to operate an easy-to-use, accessible and transparent complaints procedure that enables users to make complaints. If services do not comply with these duties, Ofcom will be able to utilise its extensive enforcement powers to bring them into compliance.
The Government are not opposed to revisiting the approach to complaints once the regime is up and running. Indeed, the Bill provides for the review of the regulatory framework. However, it is important that the new approach, which will radically change the regulatory landscape by proactively requiring services to have effective systems and processes for complaints, has time to bed in before it is reassessed.
Turning specifically to the points made by my noble friend and by the noble Baroness, Lady Kidron, about the impartial out-of-court dispute resolution procedure under the video-sharing platform (VSP) regime: the VSP regime and the Online Safety Bill are not directly comparable. The underlying principles of both regimes are of course the same, with a focus on systems regulation and protections for users, especially children. The key difference is the online safety framework's increased scope: the Bill covers a wider range of harms and introduces online safety duties on a wider range of platforms. Under the online safety regime, Ofcom will also have a more extensive suite of enforcement powers than under the UK's VSP regime.
On user redress, the Bill goes further than the VSP regime, as it will require services to offer an extensive and effective complaints process and will enable Ofcom to take stronger enforcement action where they fail to meet this requirement. That is why the Government have put the onus of the complaints procedure on the provider and set out a more robust approach, which requires all in-scope regulated user-to-user and search services to offer an effective complaints process that provides for appropriate action to be taken in relation to the complaint. This will be an enforceable duty and will enable Ofcom to utilise its extensive online safety enforcement powers where services are not complying with their statutory duty to provide a usable, accessible and transparent complaints procedure.
At the same time, we want to ensure that the regime can develop and respond to new challenges. That is why we have included a power for the Secretary of State to review the regulatory framework once it is up and running. This will provide the correct mechanism to assess whether complaint handling mechanisms can be further strengthened once the new regulations have had time to bed in.
For these reasons, the Government are confident that the Online Safety Bill represents a significant step forward in keeping users safe online.
My Lords, could I just ask a question? This Bill has been in gestation for about five to six years, during which time the scale of the problems we are talking about has increased exponentially. The Government appear to be suggesting that they will, in three to five years, evaluate whether or not their approach is working effectively.
There was a lot of discussion in this Chamber yesterday about the will of the people and whether the Government were ignoring it. I gently suggest that the very large number of people who are having all sorts of problems, or who are fearful of harm from the online world, will not find, in the timescale the Government are proposing, the sort of remedy and speed of action that I suspect they were hoping for. Certainly, the rhetoric that the Government have used, and continue to use at regular points in the Bill when they are slightly on the back foot, seems designed to make the situation seem better than it is.
Will the Minister and the Bill team take on board that there are some very serious concerns that there will be a considerable backlash against His Majesty's Government if, in three years' time, as I fear may be the case, we still have a situation in which a large body of complaints is not being dealt with? Ofcom is going to suffer from major ombudsman-like constipation trying to deal with this, and the harms will continue. I think I speak for the Committee when I say that the arguments the Minister and the government side are making really do not hold water.
Users are provided with considerably more rights than they have today with the service provider. Indeed, Ofcom would not necessarily deal with individual complaints—
They would go to the service provider in the first instance and then—
Online Safety Bill Debate
(1 year, 6 months ago)
Lords Chamber

I personally think that it is a pessimistic view of the future to suggest that humanity cannot rise to the task of distinguishing between deep fakes and real images. Organising all our lives, laws and liberties around the deviant predilections of a minority of sexual offenders, on the basis that none of us will be able to tell the difference in future when it comes to that kind of activity, is rather dangerous for freedom and innovation.
My Lords, I will speak very briefly. I could disagree with much of what the noble Baroness just said, but I do not need to go there.
What particularly resonates with me today is that, since I first entered your Lordships’ House at the tender age of 28 in 1981, this is the first time I can ever remember us having to rein back what we are discussing because of the presence of young people in the Public Gallery. I reflect on that, because it brings home the gravity of what we are talking about and its prevalence; we cannot run away or hide from it.
I will ask the Minister about the International Regulatory Cooperation for a Global Britain: Government Response to the OECD Review of International Regulatory Cooperation of the UK, published on 2 September 2020. He will not thank me for that, because I am sure that he is already familiar with, and word-perfect on, this particular document, which was pulled together by his noble friend, the noble Lord, Lord Callanan. I raise it because to think that we can in any way, shape or form, with this piece of legislation, stem the tide of what is happening in the online world, which is happening globally, by trying to create regulatory and legal borders around our benighted island is just for the fairies. It is not going to happen.
Can the Minister tell us about the degree to which, at an international level, we are proactively talking to, and learning from, other regulators in different jurisdictions, which are battling exactly the same things that we are? To concentrate the Minister’s mind, I will point out what the noble Lord, Lord Callanan, committed the Government to doing nearly three years ago. First, in relation to international regulatory co-operation, the Government committed to
“developing a whole-of-government IRC strategy, which sets out the policies, tools and respective roles of different departments and regulators in facilitating this; … developing specific tools and guidance to policy makers and regulators on how to conduct IRC; and … establishing networks to convene international policy professionals from across government and regulators to share experience and best practice on IRC”.
I am sure that, between now and when he responds, he will be given a detailed answer by the Bill team, so that he can tell us exactly where the Government, his department and Ofcom are in carrying out the commitments of the noble Lord, Lord Callanan.
My Lords, although I arrived a little late, I will say, very briefly, that I support the amendments wholeheartedly. I support them because I see this as a child protection issue. Viewing AI-generated imagery of this kind will, I believe, lead people to go out and find real children to sexually abuse. I will not take up any more time, but I wholeheartedly agree with everything that has been said, apart from what the noble Baroness, Lady Fox, said. I hope that the Minister will look very seriously at the amendments and take them into consideration.
Online Safety Bill Debate
(1 year, 6 months ago)
Lords Chamber

My Lords, following on from the excellent points that the noble Baroness has made, I want to pursue the same direction. In this group of amendments we are essentially trying to reduce the incidence of tragedies such as those that the families there in the Gallery have experienced and trying to ensure that no one—that is probably unrealistic, but at least far fewer people—will have the same experience.
I particularly want to focus the Minister and the Bill team on thinking through how to ensure that, as and when something tragic happens, the experience that the families have, and the help that I hope in future they will be able to receive, make it less traumatic, lonely and baffling than it clearly has been to date.
At the heart of this, we are talking about communication; about the relationship between Ofcom and the platforms; probably about the relationships between platforms and other platforms, in sharing knowledge; about the relationship between Ofcom and government; about the relationship between Ofcom and regulators in other jurisdictions; and about the relationship between our Government and other Governments, including, most importantly, the Government in the US, where so many of these platforms are based. There is a network of communication that has to work. By its very nature, trying to capture something as all-encompassing as that in primary legislation will in some ways be out of date before it even hits the statute book. It is therefore incredibly important that there is a dynamic information-sharing and analytics process to understand what is going on in the online world, and what the experience is of individuals who are interacting with that world.
That brings me neatly back to an amendment that we have previously discussed, which I suspect the noble Viscount sitting on the Front Bench will remember in painful detail. When we were talking about the possibility of having an independent ombudsman to go to, what we heard from all around the House was, “Where do we go? If we have gone to the platforms and through the normal channels but are getting nowhere, where do we go? Are we on our own?”. The answer that we felt we were getting a few weeks ago was, “That’s it, you’ve got to lump it”. That is simply not acceptable.
I ask the Minister and the Bill team to ensure that there is recognition of the dynamic nature of what we are dealing with. We cannot capture it in primary legislation. I hope we will not try to capture it in secondary instruments either; speaking as a member of the Secondary Legislation Scrutiny Committee, I can say that we have quite enough of them as it is, so we do not want any more, thank you very much. However, it is incredibly important that the Government think about a dynamic means of keeping information up to date, so that they and all the other parties in this area know what is going on.
My Lords, I support this group of amendments. I pay tribute to the families who I see are watching us as we debate this important group. I also pay tribute to my noble friend Lady Newlove, who has just given one of the most powerful speeches in the full 10 days of Committee.
The real sadness is that we are debating what happens when things go horribly wrong. I thank my noble friend the Minister and the Secretary of State, who is currently on leave, for the very collaborative way in which I know they have approached trying to find the right package—we are all waiting for the Minister to stand up and show it to us. Very often, Governments do not want to give concessions early in the passage of a Bill because they worry that those of us campaigning for them will then ask for more. In this case, as the noble Lord, Lord Russell, has just pointed out, we should remember that a concession granted here helps only when things have already gone horribly wrong.
As the noble Baroness, Lady Kidron, said, what we really want is a safer internet, where fewer children die. I reiterate the comments that she made at the end of her speech: as we have gone through Committee, we have all learned how interconnected the Bill is. It is fantastic that we will be able to put changes into it that will spare bereaved families from having to follow the path that the Russells and all the other bereaved families campaigning for this had to follow—but that will not be enough. We also need to ensure that we put in place the safety-by-design amendments that we have been discussing. I would argue that one of the most important is the one that the noble Lord, Lord Russell, has just referenced: being able to get help when you already know that your child is in trouble. No one wants us to be left saying only, “It’s okay. Bereaved families have what they need”. We need to do more than that.
Online Safety Bill Debate
(1 year, 5 months ago)
Lords Chamber

My Lords, I will speak to my Amendments 281 to 281B. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to them. I will deal first with Amendments 281 and 281B, then move to 281A.
On Amendments 281 and 281B, the Minister will recall that in Committee we had a discussion around how functionality is defined in the Bill and that a great deal of the child risk assessments and safety duties must have regard to functionality, as defined in Clause 208. However, as it is currently written, this clause appears to separate out functionalities of user-to-user services and search services. These two amendments are designed to adjust that slightly, to future-proof the Bill.
Why is this necessary? First, it is likely that, in future, many of the functionalities we currently see on user-to-user services will become present on search services, and possibly vice versa; we need to take account of how the world is likely to move. Secondly, this is already happening, and it poses a risk to children. Research by the 5Rights Foundation has found that “predictive search”, counted in the Bill as a search service functionality, is present on social media websites, leading one child user of a search bar to be presented in nanoseconds with prompts associated with eating disorders. In Committee, the Minister noted that the functionalities listed in this clause are non-exhaustive. At the very least, it would be helpful to make that clear in the language of the Bill.
Amendment 281A would add specific functionalities which we know are addictive or harmful to children and put them in the Bill. We have a great deal of research and evidence which demonstrates how persuasive certain design strategies are with children. These are features which are solely designed to keep users on the platform, at any cost, as much as possible and for as long as possible. The more that children are on the platform, the more harm they are likely to suffer. Given that the purpose of this Bill is for services to be safe by design, as set out usefully in Amendment 1, please can we make sure that where we know—and we do know—that risk exists, we are doing our utmost to tackle it?
The features listed in this amendment are known as “dark patterns”, and they are known as “dark patterns” for a very good reason. They are persuasive and pervasive design features, deliberately baked into the design of digital services and products to capture and hold, in this case, children’s attention and to create habitual, even compulsive behaviours. The damage this does to children is proven and palpable. For example, one of the features mentioned is infinite scroll, which is now ubiquitous on most major social media platforms. The inventor of infinite scroll, a certain Aza Raskin, who probably thought it was a brilliant idea at the time, has said publicly that he now deeply regrets ever introducing it, because of the effect it is having on children.
One of the young people who spoke to the researchers at 5Rights said of the struggle they have daily with the infinite scroll feature:
“Scrolling forever gives me a sick feeling in my stomach. I’m so aware of how little control I have and the feeling of needing to be online is overwhelming and consuming”.
Features designed to keep users online at any cost—perhaps fine for adults, but not for children—are taking a real toll. Managing the public and frequent interactions online that these features encourage creates enormous pressures for young people, and with those pressures come anxiety, low self-esteem and mental health challenges. This is only increasing and, unless we are very specific about these features, it will continue.
We have the evidence. We know what poses harm and risk to children. Please can we make sure that this is reflected accurately in the Bill?
My Lords, I rise briefly to support many of the amendments in this group. I will start with Amendments 281, 281A and 281B in the name of my noble friend Lord Russell, to which I have added my name. The noble Lord set out the case very well. I will not reiterate what he said, but it is simply the case that the features and functionalities of regulated companies should not be separated into search and user-to-user but should apply to any regulated company that has a given feature. There is no need to worry about a company that does not have one of the features on the list; it is far more dangerous for a feature to be absent from the list than to have a single list and hold companies responsible for the features they do have.
Only this morning, Meta released Threads as its challenger to Twitter. In the last month, Snapchat added generative AI to its offering. Instagram now does video, and TikTok does shopping. All these companies are moving into a place where they would like to be the one that does everything. That is their commercial endgame, and that is where the Bill should set its sights.
Separating out functionality and, as the noble Lord, Lord Russell, said, failing to add what we already know, puts the Bill in danger of looking very old before the ink is dry. I believe it unnecessarily curtails Ofcom in being able to approach the companies for what they are doing, rather than for what the Bill thought they might be doing at this point. So, if the Minister is not in a position to agree to the amendment, I urge him at least to take it away and have a look at it, because it is a technical rather than an ideological matter. It would be wonderful to fix it.
My Lords, our debate on this group is on the topic of priority harms to children. It is not one that I have engaged in, so I tread carefully. I have not engaged in this debate because I have left it to people who know far more about it than I do; I have concentrated on other parts of the Bill.
In the context of this debate, one thing has come up on which I feel moved to make a short contribution: misinformation and disinformation content. There was an exchange between my noble friend Lady Harding and the noble Baroness, Lady Fox, on this issue. Because I have not engaged on the topic of priority harms, I genuinely do not have a position on what should and should not be featured, and I would not want anybody to take what I say as support for or opposition to any of these amendments. However, it is important to acknowledge that, as critical as misinformation and disinformation are, particularly for children and young people because, as the right reverend Prelate said, the truth matters, we cannot ignore the fact that they have become quite political concepts. People often define things that they do not agree with as misinformation; opinions are becoming categorised as misinformation.
We are now putting this into legislation, and it is having an impact on content, so it is important that we do not simply dismiss that kind of concern as irrelevant, because it is real. That is all I wanted to say.
My Lords, I will speak briefly as I know that we are waiting for a Statement.
If you talk to colleagues who know a great deal about the harm that is happening and the way in which platforms operate, as well as to colleagues who talk directly to the platforms, one phrase commonly recurs when they speak to senior people about some of the problems here: “I never thought of that before”. That is true whether it is about favourites on Snapchat, which cause grief in friendship groups; about the fact that, when somebody leaves a WhatsApp group, it flags up who that person is—who wants to be seen as the person who took the decision to leave?—or about the fact that a child is recommended to other children even if the company does not know whether they are remotely similar.
If you are 13, you are introduced as a boy to Andrew Tate; if you are a girl, you might be introduced to a set of girls who may or may not share anorexia content, but they dog-whistle and blog. The companies are not deliberately orchestrating these outcomes—it is the way they are designed that is causing those consequences—but, at the moment, they take no responsibility for what is happening. We need to reflect on that.
I turn briefly to a meeting that the noble Lord, Lord Stevenson, and I attended yesterday afternoon, which leads neatly on to some of the comments the noble Baroness, Lady Fox, made a few moments ago about the far right. The meeting was convened by Luke Pollard MP and was on the strange world known as the manosphere: the world of incels, or involuntary celibates. As your Lordships may be aware, on various occasions certain individuals who identify as such have committed murder and other crimes. It is a very strange world.
Online Safety Bill Debate
(1 year, 5 months ago)
Lords Chamber

My Lords, as often, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Kidron, and to support this group of amendments, especially those to which I put my name. I thank the Minister and the Secretary of State for the many amendments they are introducing, including in the last group, on which I was not able to speak for similar reasons to other noble Lords. I especially note Amendment 1, which makes safety by design the object of the Bill and makes implicit the amendments that we are speaking to this afternoon, each of which is consistent with that object of safety by design running through the Bill.
As others have said, this is an immensely complex Bill, and anything which introduces clarity for the technology companies and the users is to be welcomed. I particularly welcome the list in Amendment 281F, which the noble Baroness, Lady Kidron, has already read aloud and which spells out very clearly the harm which results from functionality as well as content. It is imperative to have that in the Bill.
In Committee, I referred to the inequality of harms between the user of a service and the forces arrayed against them. You may like to imagine a child of eight, 12 or 15 using one of the many apps we are discussing this afternoon. Now imagine the five As as forces arrayed against them; they are all about functionality, not content. We must consider: the genius of the advertising industry, which is designed on a commercial basis for sales and profit; the fact that processes, applications and smartphones mean that there is 24/7 access to those who use these services and no escape from them; the creation of addictions by various means of rewarding particular features, which have little to do with content and everything to do with design and function; the creative use of algorithms, which will often be invisible and undetectable to adult users and certainly invisible to children; and the generation of further harms through artificial intelligence and deep fakes, all of them harms resulting from functionality. Advertising, access, addiction, algorithms and artificial intelligence are multiplying harms in a range of ways, which we have heard discussed so movingly today.
The quantity of harm means the socialisation, normalisation and creation of environments which are themselves toxic online and which would be completely unacceptable offline. I very much hope, alongside others, that the Government will give way on these amendments and build the naming of functionality and harm into the Bill.
My Lords, I will speak, in part, to the two amendments that carry my name and to which my noble friend Lady Kidron referred: Amendments 46 and 90, on the importance of dissemination and not just content.

A more effective way of making the same point is to personalise it, by trying to give your Lordships an understanding of the experience taking place, day in, day out, for many young people. I address this not only to the Minister and the Bill team but, quite deliberately, to the Office of the Parliamentary Counsel. I know full well that the Bill has been many years in gestation and that, because the online world, technology and now AI are moving so fast, it is almost impossible for the Bill and its architecture to keep pace with them. But that is not a good reason for not listening to and accepting the force of the argument which my noble friend Lady Kidron and many others have put forward.

Last week, on the first day of Report, when we were speaking to a group of amendments, I spoke to your Lordships about a particular set of functionalities called dark patterns: a variety of features built into the design of these platforms to drive more and more volume and usage.
The individual whose journey I will be describing is called Milly. Milly is online and she accepts an automatic suggestion that is on a search bar. Let us say it is about weight loss. She starts to watch videos that she would not otherwise have found. The videos she is watching are on something called infinite scroll, so one just follows another that follows another, potentially ad infinitum. To start off, she is seeing video after video of people sharing tips about dieting and showing how happy they are after losing weight. As she scrolls and interacts, the women she sees mysteriously seem to get thinner and thinner. The platform’s content dispersal strategy—if indeed it has one, because not all do—that tempers the power of the algorithm has not yet kicked in. The Bill does not address this because, individually, not a single one of the videos Milly has been watching violates the definition of primary priority content. Coding an algorithm to meet a child’s desire to view increasingly thin women is what they are doing.
The videos that Milly sees are captioned with a variety of hashtags such as #thinspo, #thighgap and #extremeweightloss. If she clicks on those, she will find more extreme videos and will start to click on the accounts that have posted the content. Suddenly, she is exposed to the lives of people who are presenting disordered eating not just as normal but as aspirational. Developmentally, Milly is at an age where she does not have the critical thinking skills to evaluate what she is seeing. She has entered a world that she is too young to understand and would never have found were it not for the design of the platform. Throughout her journey thus far, she has yet to see a single video that meets the threshold of primary priority harm content. This world is the result of cumulative design harms.
She follows some of the accounts, which prompts the platform to recommend similar accounts. Many of the accounts recommended to her are even more extreme. They are managed by people who have active eating disorders but see what is known as their pro-ana status—that is, pro-anorexia—as a lifestyle choice rather than a mental health issue. These accounts are very savvy about the platform’s community guidelines, so the videos and the language they use are coded specifically to avoid detection.
Every aspect of the way Milly is interacting with the platform has now been polluted. It is not just the videos she sees. It is the autocomplete suggestions she gets on searches. It is the algorithmically determined account recommendations. It is the design strategies that make it impossible for her to stop scrolling. It is the notifications she receives encouraging her back to the platform to watch yet another weight-loss video or follow yet another account. It is the filters and effects she is offered before she posts. It is the number of likes her videos get. It goes on and on, and the Bill as it stands will fail Milly. This is why I am talking directly to the Minister and the Office of the Parliamentary Counsel, because they need to sort this out.
Earlier this afternoon, before we began this debate, I was talking to Dr Kaitlyn Regehr, an associate professor in digital humanities at UCL. We were talking about incels—involuntary celibates—and the strange world they live in, and she made a comment that struck me, which I wrote down word for word. She said:
“One off-day seeds the algorithm. The algorithm will focus on that and amplify that one off-day”—
that one moment when we click on something and suddenly it takes us into a world, and in a direction, that we had no idea existed and, more importantly, that we feel we have no control over, because of the way these systems are designed. We really must do something about this.
My Lords, I rise to support the amendments in the names of the intrepid noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford. They fit hand in hand with the amendments that have just been debated in the previous group. Sadly, I was unable to take part in that debate because of a technical ruling, but I thank the Minister for his kind words and thank other noble Lords for what they have said. But my heart is broken, because they included age verification, for which I have campaigned for the past 12 years, and I wanted to thank the Government for finally accepting that children need to be protected from online harmful content, pornography being one example; it is the gateway to many other harms.
My Lords, the final issue I raised in Committee is dealt with in this group on so-called proportionality. I tabled amendments in Committee to ensure that under Part 3 no website or social media service with pornographic content could argue that it should be exempt from implementing age verification under Clause 11 because to do so would be disproportionate based on its size and capacity. I am pleased today to be a co-signatory to Amendment 39 tabled by the noble Lord, Lord Bethell, to do just that.
The noble Lord, Lord Russell, and the noble Baroness, Lady Kidron, have also tabled amendments which raise similar points. I am disappointed that despite all the amendments tabled by the Minister, the issue of proportionality has not been addressed; maybe he will give us some good news on that this evening. It feels like the job is not quite finished and leaves an unnecessary and unhelpful loophole.
I will not repeat in depth all the arguments I made in Committee, but I will briefly recap: we all know that, in the offline world, we expect consistent regulation regardless of size when it comes to protecting children. We do not allow a small corner shop to act differently from a large supermarket on the sale of alcohol or cigarettes. In a similar online scenario, we do not expect small and large gambling websites to regulate children’s access to gambling differently.
We know that the impact of pornographic content on children is the same whether it is accessed on a large pornographic website or a small social media platform. We know from the experience of France and Germany that pornographic websites will do all they can to evade age verification. As the noble Lord, Lord Stevenson, said on the eighth day of Committee, whether pornography
“comes through a Part 3 or Part 5 service, or accidentally through a blog or some other piece of information, it has to be stopped. We do not want our children to receive it. That must be at the heart of what we are about, and not just something we think about as we go along”.—[Official Report, 23/5/23; col. 821.]
By not shutting off the proportionality argument, the Government are allowing different-sized online services to act differently on pornography and all the other primary priority content, as I raised in Committee. At that stage, the noble Baroness, Lady Kidron, said,
“we do not need to take a proportionate approach to pornography”.—[Official Report, 2/5/23; col. 1481.]
Amendment 39 would ensure that pornographic content is treated as a separate case with no loopholes for implementing age verification based on size and capacity. I urge the Minister to reflect on how best we can close this potential loophole, and I look forward to his concluding remarks.
My Lords, I will briefly address Amendments 43 and 87 in my name. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to these amendments. They are complementary to the others in this group, on which the noble Lord, Lord Bethell, and the noble Baroness, Lady Ritchie, have spoken.
In Committee the Minister argued that it would be unfair to place the same child safety duties across all platforms. He said:
“This provision recognises that what it is proportionate to require of providers at either end of that scale will be different”.—[Official Report, 2/5/23; col. 1443.]
Think back to the previous group of amendments we debated. We talked about functionality and the way in which algorithms drive these systems. They drive you in all directions: to a large platform with every bell and whistle you might anticipate, which complies with the legislation, but also, willy-nilly and without any conscious thought, to a much smaller site, because that is how the system is designed. If we do not amend the legislation as it stands, the algorithms will take you to smaller sites that are not required to meet the same level of safety duties, particularly towards children. I think we all fail to understand the logic behind that argument.