19 Lord Bishop of Oxford debates involving the Department for Digital, Culture, Media & Sport

Online Safety Bill

Lord Bishop of Oxford Excerpts
I am willing to accept that the amendments that I put my name to and that the noble Baroness, Lady Kidron, introduced so powerfully might not be the best way to do this. We might well have unintentionally fallen on to a landmine in this complex Bill. But I cannot accept that it is not necessary to put it in the Bill, so I urge my noble friend the Minister to accept the principle behind these amendments. If he cannot accept them today, I ask him to commit firmly to bringing back government amendments that put non-content harms in the Bill. Otherwise, I will need to follow the noble Baroness, Lady Kidron, through the Lobbies.
The Lord Bishop of Oxford

My Lords, as often, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Kidron, and to support this group of amendments, especially those to which I put my name. I thank the Minister and the Secretary of State for the many amendments they are introducing, including in the last group, on which I was not able to speak for similar reasons to other noble Lords. I especially note Amendment 1, which makes safety by design the object of the Bill and makes implicit the amendments that we are speaking to this afternoon, each of which is consistent with that object of safety by design running through the Bill.

As others have said, this is an immensely complex Bill, and anything which introduces clarity for the technology companies and the users is to be welcomed. I particularly welcome the list in Amendment 281F, which the noble Baroness, Lady Kidron, has already read aloud and which spells out very clearly the harm which results from functionality as well as content. It is imperative to have that in the Bill.

In Committee, I referred to the inequality of harms between the user of a service and the forces arrayed against them. You may like to imagine a child of eight, 12 or 15 using one of the many apps we are discussing this afternoon. Now imagine the five As as forces arrayed against them; they are all about functionality, not content. We must consider: the genius of the advertising industry, which is designed on a commercial basis for sales and profit; the fact that processes, applications and smartphones mean that there is 24/7 access to those who use these services and that there is no escape from them; the creation of addictions by various means of rewarding particular features, which have little to do with content and everything to do with design and function; the creative use of algorithms, which will often be invisible and undetectable to adult users and certainly invisible to children; and the generation of more harms through artificial intelligence and deep fakes: all of them harms resulting from functionality. Advertising, access, addiction, algorithms and artificial intelligence are multiplying harms in a range of ways, which we have heard discussed so movingly today.

The quantity of harm means the socialisation, normalisation and creation of environments which are themselves toxic online and which would be completely unacceptable offline. I very much hope, alongside others, that the Government will give way on these amendments and build the naming of functionality and harm into the Bill.

Lord Russell of Liverpool (CB)

My Lords, I will speak, in part, to two amendments that bear my name and to which my noble friend Lady Kidron referred: Amendments 46 and 90, on the importance of dissemination and not just content.

A more effective way of making the same point is to personalise it by trying to give your Lordships an understanding of the experience taking place, day in, day out, for many young people. I address this not only to the Minister and the Bill team but, quite deliberately, to the Office of the Parliamentary Counsel. I know full well that the Bill has been many years in gestation and, because the online world, technology and now AI are moving so fast, it is almost impossible for the Bill and its architecture to keep pace with them. But that is not a good reason for not listening to and accepting the force of the argument which my noble friend Lady Kidron and many others have put forward.

Last week, on the first day on Report, when we were speaking to a group of amendments, I spoke to your Lordships about a particular functionality called dark patterns, which are a variety of different features built into the design of these platforms to drive more and more volume and usage.

The individual whose journey I will be describing is called Milly. Milly is online and she accepts an automatic suggestion on the search bar. Let us say it is about weight loss. She starts to watch videos that she would not otherwise have found. The videos she is watching are on something called infinite scroll, so one just follows another that follows another, potentially ad infinitum. To start off, she is seeing video after video of people sharing tips about dieting and showing how happy they are after losing weight. As she scrolls and interacts, the women she sees mysteriously seem to get thinner and thinner. The platform’s content dispersal strategy—if indeed it has one, because not all do—that tempers the power of the algorithm has not yet kicked in. The Bill does not address this because, individually, not a single one of the videos Milly has been watching violates the definition of primary priority content. What the platform is doing is coding an algorithm to meet a child’s desire to view increasingly thin women.

The videos that Milly sees are captioned with a variety of hashtags such as #thinspo, #thighgap and #extremeweightloss. If she clicks on those, she will find more extreme videos and will start to click on the accounts that have posted the content. Suddenly, she is exposed to the lives of people who are presenting disordered eating not just as normal but as aspirational. Developmentally, Milly is at an age where she does not have the critical thinking skills to evaluate what she is seeing. She has entered a world that she is too young to understand and would never have found were it not for the design of the platform. Throughout her journey thus far, she has yet to see a single video that meets the threshold of primary priority harm content. This world is the result of cumulative design harms.

She follows some of the accounts, which prompts the platform to recommend similar accounts. Many of the accounts recommended to her are even more extreme. They are managed by people who have active eating disorders but see what is known as their pro-ana status—that is, pro-anorexia—as a lifestyle choice rather than a mental health issue. These accounts are very savvy about the platform’s community guidelines, so the videos and the language they use are coded specifically to avoid detection.

Every aspect of the way Milly is interacting with the platform has now been polluted. It is not just the videos she sees. It is the autocomplete suggestions she gets on searches. It is the algorithmically determined account recommendations. It is the design strategies that make it impossible for her to stop scrolling. It is the notifications she receives encouraging her back to the platform to watch yet another weight-loss video or follow yet another account. It is the filters and effects she is offered before she posts. It is the number of likes her videos get. It goes on and on, and the Bill as it stands will fail Milly. This is why I am talking directly to the Minister and the Office of the Parliamentary Counsel, because they need to sort this out.

Earlier on this afternoon, before we began this debate, I was talking to an associate professor in digital humanities at UCL, Dr Kaitlyn Regehr. We were talking about incels—involuntary celibates—and the strange world they live in, and she made a comment. This is a quote that I wrote down word for word because it struck me. She said:

“One off-day seeds the algorithm. The algorithm will focus on that and amplify that one off-day”—
that one moment when we click on something and suddenly it takes us into a world and in a direction that we had no idea existed but, more importantly, because of the way these are designed, we feel we have no control over. We really must do something about this.

Online Safety Bill

Lord Bishop of Oxford Excerpts
Lord Clement-Jones (LD)

My Lords, I just want to elucidate whether the Minister has any kind of brief on my Amendment 152A. I suspect that he does not; it is not even grouped—it is so recent that it is actually not on today’s groupings list. However, just so people know what will be coming down the track, I thought it would be a good idea at this stage to say that it is very much about exactly the question that the noble Baroness, Lady Harding, was asking. It is about the interaction between a provider environment and a user, with the provider environment being an automated bot—or “tool”, as my noble friend may prefer.

It seems to me that we have an issue here. I absolutely understand what the Minister has done, and I very much support Amendment 153, which makes it clear that user-generated content can include bots. But this is not so much about a human user using a bot or instigating a bot; it is much more about a human user encountering content that is generated in an automated way by a provider, and then the user interacting with that in a metaverse-type environment. Clearly, the Government are apprised of that with regard to Part 5, but there could be a problem as regards Part 3. This is an environment that the provider creates, but it is interacted with by a user as if that environment were another user.

I shall not elaborate or make the speech that I was going to make, because that would be unfair to the Minister, who needs to get his own speaking note on this matter. But I give him due warning that I am going to degroup and raise this later.

The Lord Bishop of Oxford

My Lords, I warmly welcome this group of amendments. I am very grateful to the Government for a number of amendments that they are bringing forward at this stage. I want to support this group of amendments, which are clearly all about navigating forward and future-proofing the Bill in the context of the very rapid development of artificial intelligence and other technologies. In responding to this group of amendments, will the Minister say whether he is now content that the Bill is sufficiently future-proofed, given the hugely rapid development of technology, and whether he believes that Ofcom now has sufficient powers to risk assess for the future and respond, supposing that there were further parallel developments in generative AI such as we have seen over the past year?

Lord Stevenson of Balmacara (Lab)

My Lords, this is a quick-fire debate on matters where most of us probably cannot even understand the words, let alone the purpose and particularity of the amendments. I want to raise points already raised by others: it seems that the Government’s intention is to ensure that the Bill is future-proofed. Why then are they restricting this group to Part 5 only? Since Part 5 is about pornography, it follows that this group has to be about only pornography—but it is rather odd that we are not looking at the wider context in which harm may occur, involving things other than simply pornography. While the Bill may well currently be able to deal with the issues that are raised in Part 3 services, does it not need to be extended to those as well? I shall leave it at that. The other services that we have are probably unlikely to raise the sorts of issues of concern that are raised by this group. None the less, it is a point that we need reassurance on.

--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, I also speak in support of Amendments 281, 281A and 281B, to which I have added my name, tabled by the noble Lord, Lord Russell. He and, as ever, the noble Baroness, Lady Kidron, have spoken eloquently, so I am not going to spend much time on these amendments, but I wanted to emphasise Amendment 281A.

In the old world of direct marketing—I am old enough to remember that when I was a marketing director it was about sending magazines, leaflets and letters—one spent all of one’s time working out how to build loyalty: how to get people to engage longer as a result of one’s marketing communication. In the modern digital world, that dwell time has been transformed into a whole behavioural science of its own. It has developed a whole set of tools. Today, we have been using the word “activity” at the beginning of the Bill in the new Clause 1 but also “features” and “functionality”. The reason why Amendment 281A is important is that there is a danger that the Bill keeps returning to being just about content. Even in Clause 208 on functionality, almost every item in subsection (2) mentions content, whereas Amendment 281A tries to spell out the elements of addiction-driving functionality that we know exist today.

I am certain that brilliant people will invent some more but we know that these ones exist today. I really think that we need to put them in the Bill to help everyone understand what we mean because we have spent days on this Bill—some of us have spent years, if not decades, on this issue—yet we still keep getting trapped in going straight back to content. That is another reason why I think it is so important that we get some of these functionalities in the Bill. I very much hope that, if he cannot accept the amendment today, my noble friend the Minister will go back, reflect and work out how we could capture these specific functionalities before it is too late.

I speak briefly on Amendments 28 to 30. There is unanimity of desire here to make sure that organisations such as Wikipedia and Streetmap are not captured. Personally, I am very taken—as I often am—by the approach of the noble Baroness, Lady Kidron. We need to focus on risk rather than using individual examples, however admirable they are today. If Wikipedia chose to put on some form of auto-scroll, the risk of that service would go up; I am not suggesting that Wikipedia is going to do so today but, in the digital world, we should not assume that, just because organisations are charities or devoted to the public good, they cannot inadvertently cause harm. We do not make that assumption in the physical world either. Charities that put on physical events have to do physical risk assessments. I absolutely think that we should hold all organisations to that same standard. However, viewed through the prism of risk, Wikipedia—brilliant as it is—does not pose a risk to child safety and therefore should not be captured by the Bill.

The Lord Bishop of Oxford

My Lords, I broadly support all the amendments in this group but I will focus on the three amendments in the names of the noble Lord, Lord Russell, and others; I am grateful for their clear exposition of why these amendments are important. I draw particular attention to Amendment 281A and its helpful list of functions that are considered to be harmful and to encourage addiction.

There is a very important dimension to this Bill, whose object, as we have now established, is to encourage safety by design. An important aspect of it is cleaning up, and setting right, 20 years or more of tech development that has not been safe by design and has in fact been found to be harmful by way of design. As the noble Baroness, Lady Harding, just said, in many conversations and in talking to people about the Bill, one of the hardest things to communicate and get across is that this is about not only content but functionality. Amendment 281A provides a useful summary of the things that we know about in terms of the functions that cause harm. I add my voice to those encouraging the Minister and the Government to take careful note of it and to capture this list in the text of the Bill in some way so that this clean-up operation can be about not only content for the future but functionality and can underline the objectives that we have set for the Bill this afternoon.

Baroness Stowell of Beeston (Con)

My Lords, I start by saying amen—not to the right reverend Prelate but to my noble friend Lady Harding. She said that we should not assume that, just because charities exist, they are all doing good; as a former chair of the Charity Commission, I can say that that is very true.

The sponsors of Amendments 281 to 281B have made some powerful arguments in support of them. They are not why I decided to speak briefly on this group but, none the less, they made some strong points.

I come back to Amendments 28 to 30. Like others, I do not have a particular preference for which of the solutions is proposed to address this problem but I have been very much persuaded by the various correspondence that I have received—I am sure that other noble Lords have received such correspondence—which often uses Wikipedia as the example to illustrate the problem.

However, I take on board what my noble friend said: there is a danger of identifying one organisation and getting so constrained by it that we do not address the fundamental problems that the Bill is about, which is making sure that there is a way of appropriately excluding organisations that should not be subject to these various regulations because they are not designed for them. I am open to the best way of doing that.

--- Later in debate ---
The Lord Bishop of Oxford

My Lords, I too welcome these amendments and thank the Minister and the Government for tabling them. The Bill will be significantly strengthened by Amendment 172 and related amendments, which put the harms, so clearly described, in the Bill itself. I identify with the comments of others that we also need to look at functionality. I hope we will do that in the coming days.

I also support Amendment 174, to which I added my name. Others have covered proposed new subsection (9B) very well; I add my voice to those encouraging the Minister to give it more careful consideration. I will also speak briefly to proposed new subsection (9A), on misinformation and disinformation content. With respect to those who have spoken against it and argued that those are political terms, I argue that they are fundamentally ethical terms. For me, the principle of ethics and the online world is not the invention of new ethics but finding ways to acknowledge and support online the ethics we acknowledge in the offline world.

Truth is a fundamental ethic. Truth builds trust. It made it into the 10 commandments:

“You shall not bear false witness against your neighbour”.
It is that ethic that would be translated across in proposed new subsection (9A). One of the lenses through which I have viewed the Bill throughout is the lens of my eight grandchildren, the oldest of whom is eight years old and who is already using the internet. Proposed new subsection (9A) is important to him because, at eight years old, he has very limited ways of checking out what he reads online—fewer even than a teenager. He stands to be fundamentally misled in a variety of ways if there is no regulation of misinformation and disinformation.

Also, the internet, as we need to keep reminding ourselves in all these debates, is a source of great potential good and benefit, but only if children grow up able to trust what they read there. If they can trust the web’s content, they will be able to expand their horizons, see things from the perspective of others and delve into huge realms of knowledge that are otherwise inaccessible. But if children grow up necessarily imbued with cynicism about everything they read online, those benefits will not accrue to them.

Misinformation and disinformation content is therefore harmful to the potential of children across the United Kingdom and elsewhere. We need to guard against it in the Bill.

Lord Allan of Hallam (LD)

My Lords, Amendment 172 is exceptionally helpful in putting the priority harms for children on the face of the Bill. It is something that we have asked for, and I know the pre-legislative scrutiny committee asked for it too; it is good to see it there. I want to comment to make sure that we all have a shared understanding of what this means, and that people out there share that understanding as well.

My understanding is that “primary priority” is, in effect, a red light—platforms must not expose children to that content if they are under 18—while “priority” is rather an amber light: on further review, for some children it will be a red light and for other children it will be a green light, and they can see the material. I am commenting partly having had the experience of explaining all this to my domestic focus group of teenagers, who said, “Really? Are you going to get rid of all this stuff for us?” I said, “No, actually, it is quite different”. It is important in our debate to do that because otherwise there is a risk that the Bill comes into disrepute. I look at something like showing the harms to fictional characters. If one has seen the “Twilight” movies, the werewolves do not come off too well, and “Lord of the Rings” is like an orc kill-fest.

As regards the point made by the noble Baroness, Lady Harding, about going to the cinema, we allow older teenagers to go to the cinema and see that kind of thing. Post the Online Safety Bill, they will still be able to access it. When we look at something like fictional characters, the Bill is there to deal with the acknowledged harm of people pushing quite vile material, whereby characters are taken out of fiction and a gory image is created, twisted and pushed at a younger child. That is what we want online providers to do—to prevent an 11 year-old seeing that—not to stop a 16 year-old enjoying the slaughter of werewolves. We need to be clear that that is what we are doing with the priority harms; we are not going further than people think we are.

There are also some interesting challenges around humour and evolving trends. This area will be hard for platforms to deal with. I raised the issue of the Tide pod challenge in Committee. If noble Lords are not familiar, it is the idea that one eats the tablets, the detergent things, that one puts into washing machines. It happened some time ago. It was a real harm and that is reflected here in the “do not ingest” provisions. That makes sense but, again talking to my focus group, the Tide pod challenge has evolved and for older teenagers it is a joke about someone being stupid. It has become a meme. One could genuinely say that it is not the harmful thing that it was. Quite often one sees something on the internet that starts harmful—because kids are eating Tide pods and getting sick—and then over time it becomes a humorous meme. At that point, it has ceased to be harmful. I read it as that filter always being applied. We are not saying, “Always remove every reference to Tide pods” but “At a time when there is evidence that it is causing harm, remove it”. If at a later stage it ceases to be harmful, it may well move into a category where platforms can permit it. It is a genuine concern.

To our freedom of expression colleagues, I say that we do not want mainstream platforms to be so repressive of ordinary banter by teenagers that they leave those regulated mainstream platforms because they cannot speak any more, even when the speech is not harmful, and go somewhere else that is unregulated—one of those platforms that took Ofcom’s letter, screwed it up and threw it in the bin. We do not want that to be an effect of the Bill. Implementation has to be very sensitive to common trends and, importantly, as I know the noble Baroness, Lady Kidron, agrees, has to treat 15, 16 and 17 year-olds very differently from 10, 11 or 12 year-olds. That will be hard.

The other area that jumped out was about encouraging harm through challenges and stunts. That immediately brought “Jackass” to mind, or the Welsh version, “Dirty Sanchez”, which I am sure is a show that everyone in the House watched avidly. It is available on TV. Talking about equality, one can go online and watch it. It is people doing ridiculous, dangerous things, is enjoyed by teenagers and is legal and acceptable. My working assumption has to be that we are expecting platforms to distinguish a new dangerous stunt such as the choking game—such things really exist—from a ridiculous “Jackass” or “Dirty Sanchez” stunt, which has existed for years and is accessible elsewhere.

The point that I am making in the round is that it is great to have these priority harms in the Bill, but it is going to be very difficult to implement them in a meaningful way whereby we catch the genuinely harmful material without over-restricting. But that is the task that we have set Ofcom and the platforms. The more that we can make it clear to people out there what we are expecting to happen, the better. We are not expecting a blanket ban on all ridiculous teenage humour or activity. We are expecting a nuanced response. That is really helpful as we go through the debate.

Online Safety Bill

Lord Bishop of Oxford Excerpts
Baroness Harding of Winscombe (Con)

My Lords, on day eight of Committee, I feel that we have all found our role. Each of us has spoken in a similar vein on a number of amendments, so I will try to be brief. As the noble Lord, Lord Allan, has spoken from his experience, I will once again reference my experience as the chief executive, for seven years, of a business regulated by Ofcom; as the chair of a regulator; and as someone who sat on the court of, arguably, the most independent of independent regulators, the Bank of England, for eight years.

I speak in support of the amendments in the name of my noble friend Lady Stowell, because, as a member of the Communications and Digital Committee, my experience, both of being regulated and as a regulator, is that independent regulators might be independent in name—they might even be independent in statute—but they exist in the political soup. It is tempting to think that they are a sort of granite island, completely immovable in the political soup, but they are more like a boat bobbing along in the turbulence of politics.

As the noble Lord, Lord Allan, has just described, they are influenced both overtly and subtly by the regulated companies themselves—I am sure we have both played that game—by politicians on all sides, and by the Government. We have played these roles a number of times in the last eight days; however, this is one of the most important groups of amendments, if we are to send the Bill back in a shape that will really make the difference that we want it to. This group of amendments challenges whether we have the right assignment of responsibility between Parliament, the regulator, government, the regulated and citizens.

It is interesting that we—every speaker so far—are all united that the Bill, as it currently stands, does not get that right. To explain why I think that, I will dwell on Amendment 114 in the name of my noble friend Lady Stowell. The amendment would remove the Secretary of State’s ability to direct Ofcom to modify a draft of the code of practice “for reasons of public policy”. It leaves open the ability to direct in the cases of terrorism, child sexual abuse, national security or public safety, but it stops the Secretary of State directing with regard to public policy. The reason I think that is so important is that, while tech companies are not wicked and evil, they have singularly failed to put internet safety, particularly child internet safety, high enough up their pecking order compared with delivering for their customers and shareholders. I do not see how a Secretary of State will be any better at that.

Arguably, the pressures on a Secretary of State are much greater than the pressures on the chief executives of tech companies. Secretaries of State will feel those pressures from the tech companies and their constituents lobbying them, and they will want to intervene and feel that they should. They will then push that bobbing boat of the independent regulator towards whichever shore they feel they need to in the moment—but that is not the way you protect people. That is not the way that we treat health and safety in the physical world. We do not say, “Well, maybe economics is more important than building a building that’s not going to fall down if we have a hurricane”. We say that we need to build safe buildings. Some 200 years ago, we were having the same debates about the physical world in this place; we were debating whether you needed to protect children working in factories, and the consequences for the economics. Well, how awful it is to say that today. That is the reality of what we are saying in the Bill now: that we are giving the Secretary of State the power to claim that the economic priority is greater than protecting children online.

I am starting to sound very emotional because at the heart of this is the suggestion that we are not taking the harms seriously enough. If we really think that we should be giving the Secretary of State the freedom to direct the regulator in such a broad way, we are diminishing the seriousness of the Bill. That is why I wholeheartedly welcome the remark from the noble Lord, Lord Stevenson, that he intends to bring this back with the full force of all of us across all sides of the Committee, if we do not hear some encouraging words from my noble friend the Minister.

The Lord Bishop of Oxford

My Lords, it is a pleasure to follow the noble Baroness, Lady Harding, whose very powerful speech took us to the heart of the principles behind these amendments. I will add my voice, very briefly, to support the amendments for all the key reasons given. The regulator needs to be independent of the Secretary of State and seen to be so. That is the understandable view of the regulator itself, Ofcom; it was the view of the scrutiny committee; and it appears to be the view of all sides and all speakers in this debate. I am also very supportive of the various points made in favour of the principle of proper parliamentary scrutiny of the regulator going forward.

One of the key hopes for the Bill, which I think we all share, is that it will help set the tone for the future global conversation about the regulation of social media and other channels. The Government’s own impact assessment on the Bill details parallel laws under consideration in the EU, France, Australia, Germany and Ireland, and the noble Viscount, Lord Colville, referred to standards set by UNESCO. The standards set in the OSB at this point will therefore be a benchmark across the world. I urge the Government to set that benchmark at the highest possible level for the independence and parliamentary oversight of the regulator.

--- Later in debate ---
To sum up, I support a clearer definition of age assurance and age verification for pornography, and a six-month implementation deadline—we all know what happened with the repeated delays in implementing Part 3 of the Digital Economy Act 2017. We need performer age checks and quicker enforcement, without the need to go to court, and most of all, to cover all porn with the same regulation, whether on social media or on dedicated sites. I urge the Government to accept all these amendments. I look forward to receiving the Minister’s assurances that this regulation of online pornographic content will be included within the scope of the Online Safety Bill. We need to show our children that we truly care.
The Lord Bishop of Oxford

My Lords, it is such a privilege to follow the noble Baroness, Lady Benjamin. I pay tribute to her years of campaigning on this issue and the passion with which she spoke today. It is also a privilege to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, in supporting all the amendments in this group. They are vital to this Bill, as all sides of this Committee agree. They all have my full support.

When I was a child, my grandparents’ home, like most homes, was heated by a coal fire. One of the most vital pieces of furniture in any house where there were children in those days was the fireguard. It was there to prevent children getting too near to the flame and the smoke, either by accident or by design. It needed to be robust, well secured and always in position, to prevent serious physical harm. You might have had to cut corners on various pieces of equipment for your house, but no sensible family would live without the best possible fireguard they could find.

We lack any kind of fireguard at present and the Bill currently proposes an inadequate fireguard for children. A really important point to grasp on this group of amendments is that children cannot be afforded the protections that the Bill gives them unless they are identified as children. Without that identification, the other protections fail. That is why age assurance is so foundational to the safety duties and mechanisms in the Bill. Surely, I hope, the Minister will acknowledge both that we have a problem and that the present proposals offer limited protection. We have a faulty fireguard.

These are some of the consequences. Three out of five 11 to 13 year-olds have unintentionally viewed pornography online. That is most of them. Four out of five 12 to 15 year-olds say they have had a potentially harmful experience online. That is almost universal. Children as young as seven are accessing pornographic content and three out of five eight to 11 year-olds—you might want to picture a nine year-old you know—have a social media profile, when they should not access those sites before the age of 13. That profile enables them to view adult content. The nation’s children are too close to the fire and are being harmed.

There is much confusion about what age assurance is. As the noble Baroness, Lady Kidron, has said, put simply it is the ability to estimate or verify an individual’s age. There are many different types of age assurance, from facial recognition to age verification, which all require different levels of information and can give varying levels of assurance. At its core, age assurance is a tool which allows services to offer age-appropriate experiences to their users. The principle is important, as what might be appropriate for a 16 year-old might be inappropriate for a 13 year-old. That age assurance is absolutely necessary to give children the protections they deserve.

Ofcom’s research shows that more than seven out of 10 parents of children aged 13 to 17 were concerned about their children seeing age-inappropriate content or adult or sexual content online. Every group I have spoken to about the Bill in recent months has shared this concern. Age assurance would enable services to create age-appropriate experiences for children online and can help prevent children’s exposure to this content. The best possible fireguard would be in place.

Different levels of age assurance are appropriate in different circumstances. Amendments 161 and 142 establish that services which use age assurance must do so in line with the basic rules of the road. They set out that age assurance must be proportionate to the level of risk of a service. For high-risk services, such as pornography, sites must establish the age of their users beyond reasonable doubt. Equally, a service which poses no risk may not need to use age assurance or may use a less robust form of age assurance to engage with children in an age-appropriate manner—for example, serving them the terms and conditions in a video format.

As has been said, age assurance must be privacy-preserving. It must not be used as an excuse for services to use the most intrusive technology for data-extractive purposes. These are such common-sense amendments, but vital. They will ensure that children are prevented from accessing the most high-risk sites, enable services to serve their users age-appropriate experiences, and ensure that age assurance is not used inappropriately in a way that contravenes a user’s right to privacy.

As has also been said, there is massive support for this more robust fireguard in the country at large, across this House and, I believe, in the other place. I have not yet been able to understand, or begin to understand, the Government’s reasons for not providing the best protection for our children, given the aim of the Bill. Better safeguards are technically possible and eminently achievable. I would be grateful if the Minister could attempt to explain what exactly he and the Government intend to do, given the arguments put forward today and the ongoing risks to children if these amendments are not adopted.

Baroness Ritchie of Downpatrick (Lab)

My Lords, it is a pleasure to follow the right reverend Prelate the Bishop of Oxford. He used an interesting analogy of the fireguard; what we want in this legislation is a strong fireguard to protect children.

Amendments 183ZA and 306 are in my name, but Amendment 306 also has the name of the noble Lord, Lord Morrow, on it. I want to speak in support of the general principles raised by the amendments in this group, which deal with five specific areas, namely: the definition of pornography; age verification; the consent of those participating in pornographic content; ensuring that content which is prohibited offline is also prohibited online; and the commencement of age verification. I will deal with each of these broad topics in turn, recognising that we have already dealt with many of the issues raised in this group during Committee.

As your Lordships are aware, the fight for age verification has been a long one. I will not relive that history but I remind the Committee that when the Government announced in 2019 that they would not implement age verification, the Minister said:

“I believe we can protect children better and more comprehensively through the online harms agenda”.—[Official Report, Commons, 17/10/19; col. 453.]


Four years later, the only definition for pornography in the Bill is found in Clause 70(2). It defines pornographic content as

“produced solely or principally for the purpose of sexual arousal”.

I remain to be convinced that this definition is more comprehensive than that in the Digital Economy Act 2017.

Amendment 183ZA is a shortened version of the 2017 definition. I know that the Digital Economy Act is out of vogue but it behoves us to have a debate about the definition, since what will be considered as pornography is paramount. If we get that wrong, age verification will be meaningless. Everything else about the protections we want to put in place relies on a common understanding of when age verification will be required. Put simply, we need to know what it is we are subjecting to age verification and it needs to be clear. The Minister stated at Second Reading that he believed the current definition is adequate. He suggested that it ensured alignment across different pieces of legislation and other regulatory frameworks. In reviewing other legislation, the only clear thing is this: there is no standard definition of pornography across the legislative framework.

For example, Section 63 of the Criminal Justice and Immigration Act 2008 uses the definition in the Bill, but it requires a further test to be applied: meeting the definition of “extreme” material. Section 368E of the Communications Act 2003 regulates online video on demand services. That definition uses the objective tests of “prohibited material”, meaning material too extreme to be classified by the British Board of Film Classification, and “specially restricted material”, covering R18 material, while also using a subjective test that covers material that

“might impair the physical, mental or moral development”

of under-18s.

Online Safety Bill

Baroness Parminter (LD)

My Lords, it is a pleasure to follow the noble Baroness, Lady Fox. I am afraid that on this issue, as I am sure she would expect, we profoundly disagree. I am delighted to support the amendment of the noble Baroness, Lady Morgan, and those from my noble friend Lord Clement-Jones, which do the same sort of thing and address the critical issue of what is a proportionate response, respecting the fact that the position for adults is different from that for children. What is a proportionate response, recognising that there is a large cadre of vulnerable people who need help to manage the beneficial but also worrying tool which is social media?

I shall cover only the issues on which I have any degree of competence in this complex field, which is to speak about the importance of this amendment because of the particular nature of eating disorders. I declare an interest as the mother of a young adult who has eating disorders and had them when she was a child. The noble Baroness, Lady Fox, talked about the need to allow adults to use their reason. Let me tell the Committee about people with eating disorders: I would love it if I could get my daughter to be as reasonable as she is when I talk to her about the benefits of proportional representation, where she can beat me hands down, when I try but fail to get her to put food in her mouth.

Eating disorders have two issues of relevance to this debate, and they are why I support the case for the strongest protection for them, the default being that people should have to opt in to have access to harmful content. First, eating disorders are intensely controlling. They suck people in, and they are not just about not eating; they control how they exercise; they control who they see; they are a control mechanism over a person’s whole life. I reject the idea that you can get someone who is controlled, day and night, by an eating disorder to make the decision to opt out of accessing social media content, when we know that people with eating disorders gravitate towards it because it provides them with content that sustains their illness. It provides them with communities of other users—the pro-mia and pro-ana sites, which sound incredibly comforting but are actually communities of people that encourage people, sometimes literally, to starve themselves to death. That controlling nature means that, for me, people having to opt in is the best way forward: it is a controlling illness.

Secondly, eating disorders are a very competitive illness. If you have anorexia, you want to be the thinnest. In the old days, that meant that you would cook food that you would not eat, but you would get your sister to eat it and you would feel good because you were thinner. Of course, with social media, you can now access all these websites where you can see people with nasogastric tubes and see people who are doing much “better”. As the noble Baroness, Lady Morgan, said, in that dreadful phrase, they provide “thinspiration”: people look for thinness and compare themselves to other people. It is an insatiable desire, so the idea that they will voluntarily opt out of that is just away with the fairies.

As I say, we need a proportionate response. I appreciate that people with eating disorders may well choose to opt in, but I think that the state in the first place should require that people have to opt into that choice. We have heard about the various mental health organisations that have made that case, but in thinking about this and talking to Rose about it, I think there is another fundamental reason why it is right that the state should take this approach. As the noble Baroness, Lady Morgan, said, eating disorders can start at a young age, but they can also start after the age of 18. If someone in their mid-20s—or mid-30s or mid-40s—is starting to feel a bit uncomfortable about their body image and starting to get some rather odd views about food but does not yet have an eating disorder, that is the time when, if they get support and do not get encouragement, we might be able to stop them getting sucked into these appalling vortexes of eating disorders. If we have this provision that people have to opt in, they might not see that content which, as has been mentioned, is being pushed at them—the right reverend Prelate the Bishop of Oxford gave examples the other week of how these sites feed you stuff immediately as soon as you start going down this route. If people have to opt in, we might just have that chance of stopping them getting an eating disorder.

Yes, people have to be given access to some of this material in a free society, but it is the role of the state to protect the vulnerable, and the particular nature of eating disorders means that, for me, this amendment is vital.

The Lord Bishop of Oxford

My Lords, it is a privilege to follow the noble Baroness, Lady Parminter, in her very moving and personal speech. I am sorry that I was unable to speak to the previous group of amendments, some of which were in my name, because, due to unavoidable business in my diocese, I was not able to be present when that debate began late last Tuesday. However, it is very good to be able to support this group of amendments, and I hope tangentially to say something also in favour of risk assessment, although I am conscious that other noble Lords have ably made many of the points that I was going to make.

My right reverend friend the Bishop of Gloucester has added her name in support of amendments in this group, and I also associate myself with them—she is not able to be here today. As has been said, we are all aware that reaching the threshold of 18 does not somehow award you with exponentially different discernment capabilities, nor wrap those more vulnerable teenagers in some impermeable cotton wool to protect them from harm.

We are united, I think, in wanting to do all we can to make the online space feel safe and be safe for all. However, there is increasing evidence that people do not believe that it is. The DCMS’s own Public Attitudes to Digital Regulation survey is concerning. The most recent data shows that the number of UK adults who do not feel safe and secure online increased from 38% in November/December 2021 to 45% in June/July 2022. If that trend continues, more than half of UK adults will soon not feel safe and secure online.

It is vital that we protect society’s most vulnerable. When people are vulnerable through mental illness or other challenges, they are surely not able to protect themselves from being exposed to damaging online content by making safe choices, as we have just heard. In making this an opt-in system, we would save lives when people are at a point of crisis.

Online Safety Bill

Baroness Ritchie of Downpatrick (Lab)

My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron. I have listened intently today, and there is no doubt that this Bill not only presents many challenges but throws up the complexity of the whole situation. I think it was the noble Lord, Lord Kamall, in an earlier group who raised the issues of security, safety and freedom. I would add the issue of rights, because we are trying to balance all these issues and characterise them in statute, vis-à-vis the Bill.

On Tuesday, we spoke about one specific harm—pornography—on the group of amendments that I had brought forward. But I made clear at that time that I believe this is not the only harm, and I fully support the principles of the amendments from the noble Baroness, Lady Kidron. I would obviously like to get some clarity from her on the amendments, particularly as to how they relate to other clauses in the Bill.

The noble Baroness has been the pioneer in this field, and her expertise is well recognised across the House. I believe that these amendments really take us to the heart of the Bill and what we are trying to achieve—namely, to identify online harms to children, counteract them and provide a level of safety to young people.

As the noble Lord, Lord Clement-Jones, said on Tuesday,

“there is absolutely no doubt that across the Committee we all have the same intent; how we get there is the issue between us”.—[Official Report, 25/4/23; col. 1196.]

There is actually not that much between us. I fully agree with the principle of putting some of the known harms to children in the Bill. If we know the harms, there is little point in waiting for them to be defined in secondary legislation by Clause 54.

It is clear to me that there are harms to children that we know about, and those harms will not change. It would be best to name those harms clearly in the Bill when it leaves this House. That would allow content providers, search engines and websites in scope of the Bill to prepare to make any changes they need to keep children safe. Perhaps the Minister could comment on that aspect. We also know that parents will expect some harms to be in the Bill. The noble Baroness, Lady Kidron, laid out what they are, and I agree with her analysis. These issues are known and we should not wait for them to be named.

While known harms should be placed into the Bill, I know, understand and appreciate that the Government are concerned about future-proofing. However, I am of the view that a short list of key topics will not undermine that principle. Indeed, the Joint Committee’s report on the draft Bill stated,

“we recommend that key, known risks of harm to children are set out on the face of the Bill”.

In its report on the Bill, the DCMS Select Committee in the other place agreed, saying

“that age-inappropriate or otherwise inherently harmful content and activity (like pornography, violent material, gambling and content that promotes or is instructive in eating disorders, self-harm and suicide) should appear on the face of the Bill”.

Has there been any further progress in discussions on those issues?

At the beginning of the year, the Children’s Commissioner urged Parliamentarians

“to define pornography as a harm to children on the face of the … Bill, such that the regulator, Ofcom, may implement regulation of platforms hosting adult content as soon as possible following the passage of the Bill”.

I fully agree with the Children’s Commissioner. While the ways in which pornographic content is delivered will change over time, the fact that pornography is harmful to children will not change. Undoubtedly, with the speed of technology—something that the noble Lord, Lord Allan of Hallam, knows a lot more about than the rest of us, having worked in this field—it will change, and we will be presented with new types of challenges.

I therefore urge the Government to support the principle that the key risks are in the Bill, and I thank the noble Baroness, Lady Kidron, for raising this important principle. However, I hope she will indulge me as I seek to probe some of the detail of her amendments and their interactions with the architecture of other parts of the Bill. As I said when speaking to Clause 49 on Tuesday, the devil is obviously in the detail.

First, Clause 54 defines what constitutes

“Content that is harmful to children”,

while Clause 205 defines harm, and Amendment 93 proposes an additional new list of harms. As I have already said, I fully support the principle of harms being in the Bill, but I raise a question for the noble Baroness. How does she see these three definitions working together? That might refer back to a preliminary discussion that we had in the tearoom earlier.

These definitions of harms are in addition to the content to be defined as primary priority content and priority content. Duties in Clauses 11 and 25 continue to refer to these two types of content for Part 3 services, but Amendments 20 and 74 would remove the need for risk assessments in Clauses 10 and 24 to address these two types of content. It seems that the amendments could create a tension in the Bill, and I am interested to ascertain how the noble Baroness, Lady Kidron, foresees that tension operating. Maybe she could give us some detail in her wind-up about that issue. An explanation of that point may bring some clarity to understanding how the new schedule that the noble Baroness proposes will work alongside the primary priority content and the priority content lists. Will the schedule complement primary priority content, or will it be an alternative?

Secondly, as I said, some harms are known but there are harms that are as yet unknown. Will the noble Baroness, Lady Kidron, consider a function to add to the list of content in her Amendment 93, in advance of us coming back on Report? There is no doubt that the online space is rapidly changing, as this debate has highlighted. I can foresee a time when other examples of harm should be added to the Bill. I accept that the drafting is clear that the list is not exclusive, but it is intended to be a significant guide to what matters to the public and Parliament. I also accept that Ofcom can provide guidance on other content under Amendment 123, but, without a regulatory power added to Amendment 93, it feels as though we are missing a belt-and-braces approach to online harms to children. After all, our principal purpose here is to protect children from online harm.

I commend the noble Baroness, Lady Kidron, on putting these important amendments before the Committee, and I fully support the principle of what she seeks to achieve. But I hope that, on further reflection, she will look at the points I have suggested. Perhaps she might suggest other ideas in her wind-up, and we could have further discussions in advance of Report. I also look forward to the Minister’s comments on these issues.

The Lord Bishop of Oxford

My Lords, I support Amendments 20, 93 and 123, in my name and those of the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Stevenson. I also support Amendment 74 in the name of the noble Baroness, Lady Kidron. I pay tribute to the courage of all noble Lords and their teams, and of the Minister and the Bill team, for their work on this part of the Bill. This work involves the courage to dare to look at some very difficult material that, sadly, shapes the everyday life of too many young people. This group of amendments is part of a package of measures to strengthen the protections for children in the Bill by introducing a new schedule of harms to children and plugging a chronological gap between Part 3 and Part 5 services, on when protection from pornography comes into effect.

Every so often in these debates, we have been reminded of the connection with real lives and people. Yesterday evening, I spent some time speaking on the telephone with Amanda and Stuart Stephens, the mum and dad of Olly Stephens, who lived in Reading, which is part of the diocese of Oxford. Noble Lords will remember that Olly was tragically murdered, aged 13, in a park near his home, by teenagers of a similar age. Social media played a significant part in the investigation and in the lives of Olly and his friends—specifically, social media posts normalising knife crime and violence, with such a deeply tragic outcome.

Online Safety Bill

However, the reality is that if we do not look at the impact of the digital world on every child, then we are adopting a different standard in the digital world than we do in the physical world. That is why the “likely to be accessed by children” definition that has been tried and tested, not just in this House but in legislatures around the world, should be what is used in this Bill.
The Lord Bishop of Oxford

My Lords, it is a pleasure to follow the two noble Baronesses. I remind the Committee of my background as a board member of the Centre for Data Ethics and Innovation. I also declare an indirect interest, as my oldest son is the founder and studio head of Mediatonic, which is now part of Epic Games and is the maker of “Fall Guys”, which I am sure is familiar to your Lordships.

I speak today in support of Amendments 2 and 92 and the consequent amendments in this group. I also support the various app store amendments proposed by the noble Baroness, Lady Harding, but I will not address them directly in these remarks.

I was remarkably encouraged on Wednesday by the Minister’s reply to the debate on the purposes of the Bill, especially by the priority that he and the Government gave to the safety of children as its primary purpose. The Minister underlined this point in three different ways:

“The main purposes of the Bill are: to give the highest levels of protection to children … The Bill will require companies to take stringent measures to tackle illegal content and protect children, with the highest protections in the Bill devoted to protecting children … Children’s safety is prioritised throughout this Bill”.—[Official Report, 19/4/23; col. 724.]


The purpose of Amendments 2 and 92 and consequent amendments is to extend and deepen the provisions in the Bill to protect children against a range of harms. This is necessary for both the present and the future. It is necessary in the present because of the harms to which children are exposed through a broad range of services, many of which are not currently in the Bill’s scope. Amendment 2 expands the scope to include any internet service that meets the child user condition and enables or promotes harmful activity and content as set out in the schedule provided. Why would the Government not take this step, given the aims and purposes of the Bill to give the highest protection to children?

Every day, the diocese of Oxford educates some 60,000 children in our primary and secondary schools. Almost all of them have or will have access to a smartphone, either late in primary, hopefully, or early in secondary school. The smartphone is a wonderful tool to access educational content, entertainment and friendship networks, but it is also a potential gateway for companies, children and individuals to access children’s inner lives, in secret, in the dead of night and without robust regulation. It therefore exposes them to harm. Sometimes that harm is deliberate and sometimes unintentional. This power for harm will only increase in the coming years without these provisions.

The Committee needs to be alert to generational changes in technology. When I was 16 in secondary school in Halifax, I did a computer course in the sixth form. We had to take a long bus ride to the computer building at Huddersfield University. The computer filled several rooms in the basement. The class learned how to program using punch cards. The answers to our questions came back days later, on long screeds of printed paper.

When my own children were teenagers and my oldest was 16, we had one family computer in the main living room of the house. The family was able to monitor usage. Access to the internet was possible, but only through a dial-up modem. The oldest of my grandchildren is now seven, and many of his friends already have smartphones. In a few years, he will certainly carry a connected device in his pocket and, potentially, have access to the entire internet 24/7.

I want him and millions of other children to have the same protection online as he enjoys offline. That means recognising that harms come in a variety of shapes and sizes. Some are easy to spot, such as pornography. We know the terrible damage that porn inflicts on young lives. Some are more insidious and gradual: addictive behaviours, the promotion of gambling, the erosion of confidence, grooming, self-harm and suicidal thoughts, encouraging eating disorders, fostering addiction through algorithms and eroding the barriers of the person.

The NSPCC describes many harms to children on social networks that we are all now familiar with, but it also highlights online chat, comments on livestream sites, voice chat in games and private messaging among the vectors for harm. According to Ofcom, nine in 10 children in the UK play video games, and they do so on devices ranging from computers to mobile phones to consoles. Internet Matters says that most children’s first interaction with someone they do not know online is now more likely to be in a video game such as “Roblox” than anywhere else. It also found that parents underestimate the frequency with which their children are contacted by strangers online.

The Gambling Commission has estimated that 25,000 children in the UK aged between 11 and 16 are problem gamblers, with many of them introduced to betting via computer games and social media. Families have been left with bills, sometimes of more than £3,000, after uncontrolled spending on loot boxes.

Online companies, we know, design their products with psychological principles of engagement firmly in view, and then refine their products by scraping data from users. According to the Information Commissioner, more than 1 million underage children could have been exposed to age-inappropriate content on TikTok alone, with the platform collecting and using their personal data.

As the noble Baroness, Lady Kidron, has said, we already have robust and tested definitions of scope in the ICO’s age-appropriate design code—definitions increasingly taken up in other jurisdictions. To give the highest protection to children, we need to build on these secure definitions in this Bill and find the courage to extend robust protection across the internet now.

We also need to future-proof this Bill. These key amendments would ensure that any development, any new kind of service not yet imagined which meets the child user condition and enables or promotes harmful activity and content, would be in scope. This would give Ofcom the power to develop new guidance and accountabilities for the applications that are certain to come in the coming years.

We have an opportunity and a responsibility, as the Minister has said, to build the highest protection into this Bill. I support the key amendments standing in my name.

Baroness Stowell of Beeston (Con)

My Lords, first, I beg the indulgence of the Committee to speak briefly at this juncture. I know that no one from the Lib Dem or Labour Benches has spoken yet, but I need to dash over to the Moses Room to speak to some amendments I am moving on the Bill being considered there. Secondly, I also ask the Committee that, if I do not get back in time for the wind-ups, I be forgiven on this occasion.

I simply wanted to say something briefly in support of Amendments 19, 22, 298 and 299, to which I have added my name. My noble friend Lady Harding has already spoken to them comprehensively, so there is little I want to add; I just want to emphasise a couple of points. But first, if I may, I will pick up on something the right reverend Prelate said. I think I am right in saying that the most recent Ofcom research shows that 57% of seven-year-olds such as his grandchild have their own phone, and that by the time children reach the age of 12 they pretty much all have their own phone. One can only imagine that the age at which children possess their own device is going to get lower.

Turning to app stores, with which these amendments are concerned, currently it is the responsibility of parents and developers to make sure that children are prevented from accessing inappropriate content. My noble friend’s amendments do not dilute in any way the responsibility that should be held by those two very important constituent groups. All we are seeking to do is ensure that app stores, which are currently completely unregulated, take their share of responsibility for making sure that those seeking to download and then use such apps are in the age group the apps are designed for.

As has already been very powerfully explained by my noble friend and by the noble Baroness, Lady Kidron, different age ratings are being given by the two different app stores right now. It is important for us to understand this in the context of the digital markets and competition Bill, which is being introduced to Parliament today. I cannot tell noble Lords how long we have waited for that legislation and how important it is, not least because it will open up competition, particularly in app stores. The more competition there is across app stores, the doorways through which children go to purchase or download apps, the more important it is that there is consistency and some regulation. That is why I support my noble friend and was very happy to add my name to her amendments.

The Lord Bishop of Oxford

My Lords, it is a pleasure to follow other noble Lords who have spoken. I too support this key first amendment. Clarity of purpose is essential in any endeavour. The amendment overall sets out the Bill’s aims and enhances what will, I hope, be vital legislation for the world as well as for the United Kingdom. The Government have the very welcome ambition of making Britain the safest country in the world to go online. The Online Safety Bill is a giant step in that direction.

As has been said, there has been remarkable consensus across the Committee on what further measures may still be needed to improve the Bill and on this first amendment, setting out these seven key purposes. Noble Lords may be aware that in the Christian tradition the number seven is significant: in the medieval period the Church taught the dangers of the seven deadly sins, the merits of the seven virtues and the seven acts of mercy. Please speak to me later if a refresher course is needed.

Amendment 1 identifies seven deadly dangers—I think they are really deadly. They are key risks which we all acknowledge are unwelcome and destructive companions of the new technologies which bring so many benefits: risks to public health or national security; the risk of serious harm to children; the risk of new developments and technologies not currently in scope; the disproportionate risk to those who manifest one or more protected characteristics; risks that occur through poor design; risks to freedom of expression and privacy; and risks that come with low transparency and low accountability. Safety and security are surely among the primary duties of government, especially the safety and security of children and the vulnerable. There is much that is good and helpful in new technology but much that can be oppressive and destructive. These seven risks are real and present dangers. The Bill is needed because of actual and devastating harm caused to people and communities.

As we have heard, we are living through a period of rapid acceleration in the development of AI. Two days ago, CBS broadcast a remarkable documentary on the latest breakthroughs by Google and Microsoft. The legislation we craft in these weeks needs future-proofing. That can happen only through a clear articulation of purpose so that the framework provided by the Bill continues to evolve under the stewardship of the Secretary of State and of Ofcom.

I have been in dialogue over the past five years with tech companies in a variety of contexts and I have seen a variety of approaches, from the highly responsible in some companies to the frankly cavalier. Good practice, especially in design, needs stronger regulation to become uniform. I really enjoyed the analogy from the noble Lord, Lord Allan, a few minutes ago. We would not tolerate for a moment design and safety standards in aeroplanes, cars or washing machines which had the capacity to cause harm to people, least of all to children. We should not tolerate lesser standards in our algorithms and technologies.

There is no map for the future of technology and its use, even over the rest of this decade, but this amendment provides a compass—a fixed point for navigation in the future, for which future generations will thank this Government and this House. These seven deadly dangers need to be stated clearly in the Bill and, as the noble Baroness, Lady Kidron, said, to be a North Star for both the Secretary of State and Ofcom. I support the amendment.

Baroness Harding of Winscombe (Con)

My Lords, I too support this amendment. I was at a dinner last night in the City for a group of tech founders and investors—about 500 people in a big hotel ballroom, all focused on driving the sort of positive technology growth in this country that I think everyone wants to see. The guest speaker runs a large UK tech business. He commented in his speech, as if it were a revelation, that Governments turn out not to speak with one voice, and that understanding what is required of tech companies by Governments is not always easy. Business needs clarity, and anyone who has run a large or small business knows that it is not really clarity in the detail that matters but clarity of purpose, because that is what enables you to lead change. If your people understand why they need to change, then in each of the micro-decisions they take each day they can adjust to fit the intent behind your purpose. That is why this amendment is so important.

I have worked in this space of online safety for more than a decade, both as a technology leader and in this House. I genuinely do not believe that business is wicked and evil, but what it lacks is clear direction. The Bill is so important in setting those guardrails that if we do not make its purpose clear, we should not be surprised if the very businesses which really do want Governments to be clear do not know what we intend.

I suspect that my noble friend the Minister might object to this amendment and say that it is already in the Bill. As others have already said, I actually hope it is. If it is not, we have a different problem. The point of an upfront summary of purpose is to do precisely that: to summarise what is in what a number of noble Lords have already said is a very complicated Bill. The easier and clearer we can make it for every stakeholder to engage in the Bill, the better. If alternatively my noble friend the Minister objects to the detailed wording of this amendment, I argue that that simply makes getting this amendment right even more important. If the four noble Lords, who know far more about this subject than I will ever do in a lifetime, and the joint scrutiny committee, which has done such an outstanding job at working through this, have got the purposes of the Bill wrong, then what hope for the rest of us, let alone those business leaders trying to interpret what the Government want?

That is why it is so important that we put the purposes of the Bill absolutely at the front of the Bill, as in this amendment. If we have misunderstood that in the wording, I urge my noble friend the Minister to come back with wording on Report that truly encapsulates what the Government want.

Online Safety Bill

Lord Bishop of Oxford Excerpts
The Lord Bishop of Oxford

My Lords, it is an honour and privilege to follow the noble Baroness, Lady Campbell, and all those who have spoken in this debate. As a member of your Lordships’ Committee on Artificial Intelligence and a founding member of the Centre for Data Ethics and Innovation, I have followed the slow progress of this Bill since the original White Paper. We have seen increasing evidence that many social media platforms are unwilling to acknowledge, let alone prevent, harms of the kind this vital Bill addresses. We know that there is an all too porous frontier between the virtual world and the physical world. The resulting harms damage real lives, real families, and real children, as we have heard.

There is a growing list of priority harms, and now there is concern, as well as excitement, over new AIs such as ChatGPT; they demonstrate yet again that technology has no inherent precautionary principle. Without systemic checks and balances, AI in every field develops faster than society can respond. We are, and for ever will be, catching up with the technology.

The Bill is very welcome, marking as it does a belated but important step towards rebalancing a complex but vital aspect of public life. I pay tribute to the Government and to civil servants for their patient efforts to address a complex set of ethical and practical issues in a proportionate way. But the job is not yet fully done.

I will concentrate on three particular areas of concern with the draft Bill. The first is the removal of risk assessments regarding harm to adults. Surely every company has a basic moral duty to assess the risk of its products or services to customers and consumers. Removal can only undermine a risk-based approach to regulation. Can the Minister explain how conducting a risk assessment erodes or threatens freedom of speech? My second concern, mentioned by others, is the Secretary of State’s powers in relation to Ofcom. This country has a strong record of independence for its media regulators. Others have touched on that, so I will not elaborate. The third area of concern I wish to raise is the Bill’s provision—or rather lack of provision—over disinformation of various kinds. I currently serve on your Lordships’ Environment and Climate Change Committee; climate disinformation and medical disinformation inflict substantial harms on society and must be included in user empowerment tools.

Other right reverend Prelates will raise their own concerns in the forthcoming Committee. My right reverend friend the Bishop of Gloucester believes that it is imperative that we prevent technology-facilitated domestic abuse, as well as bring in a code of practice to keep women and girls safe online. To help young people flourish, we should look at controlling algorithmically served content, restrictions on face and body-editing apps, as well as improving media literacy overall. She is unable to speak today, but will follow these issues closely.

The Bill is vital for the health of children and adults, and the flourishing of our whole society. I look forward to progress being made in this House.

AI in the UK (Liaison Committee Report)

Lord Bishop of Oxford Excerpts
Wednesday 25th May 2022

Grand Committee
The Lord Bishop of Oxford

My Lords, it is a pleasure to follow the noble Lord, Lord Evans, and thank him in this context for his report, which I found extremely helpful when it was published and subsequently. It has been a privilege to engage with the questions around AI over the last five years through the original AI Select Committee so ably chaired by the noble Lord, Lord Clement-Jones, in the Liaison Committee and as a founding board member for three years of the Centre for Data Ethics and Innovation. I thank the noble Lord for his masterly introduction today and other noble Lords for their contributions.

There has been a great deal of investment, thought and reflection regarding the ethics of artificial intelligence over the last five years in government, the National Health Service, the CDEI and elsewhere—in universities, with several new centres emerging, including in the universities of Oxford and Oxford Brookes, and by the Church and faith communities. Special mention should be made of the Rome Call for AI Ethics, signed by Pope Francis, Microsoft, IBM and others at the Vatican in February 2020, and its six principles of transparency, inclusion, accountability, impartiality, reliability and security. The most reverend Primate the Archbishop of Canterbury has led the formation of a new Anglican Communion Science Commission, drawing together senior scientists and Church leaders across the globe to explore, among other things, the impact of new technologies.

Despite all this endeavour, there is in this part of the AI landscape no room for complacency. The technology is developing rapidly and its use for the most part is ahead of public understanding. AI creates enormous imbalances of power with inherent risks, and the moral and ethical dilemmas are complex. We do not need to invent new ethics, but we need to develop and apply our common ethical frameworks to rapidly developing technologies and new contexts. The original AI report suggested five overarching principles for an AI code. It seems appropriate in the Moses Room to say that there were originally 10 commandments, but they were wisely whittled down by the committee. They are not perfect, in hindsight, but they are worth revisiting five years on as a frame for our debate.

The first is that artificial intelligence should be developed for the common good and benefit of humanity; as the noble Lord, Lord Holmes, eloquently said, the debate often slips straight into the harms and ignores the good. This principle is not self-evident and needs to be restated. AI brings enormous benefits in medicine, research, productivity and many other areas. The role of government must be to ensure that these benefits are to the common good—for the many, not the few. Government, not big tech, must lead. There must be a fair distribution of the wealth that is generated, a fair sharing of power through good governance and fair access to information. This simply will not happen without national and international regulation and investment.

The second principle is that artificial intelligence should operate on principles of intelligibility and fairness. This is much easier to say than to put into practice. AI is now being deployed, or could be, in deeply sensitive areas of our lives: decisions about probation, sentencing, employment, personal loans, social care—including of children—predictive policing, the outcomes of examinations and the distribution of resources. The algorithms deployed in the private and public sphere need to be tested against the criteria of bias and transparency. The governance needs to be robust. I am sure that an individualised, contextualised approach in each field is the right way forward, but government has a key co-ordinating role. As the noble Lord, Lord Clement-Jones, said, we do not yet have that robust co-ordinating body.

Thirdly, artificial intelligence should not be used to diminish the data rights or privacy of individuals, families or communities. As a society, we remain careless of our data. Professor Shoshana Zuboff has exposed the risks of surveillance capitalism and Frances Haugen, formerly of Meta, has exposed the way personal data is open to exploitation by big tech. Evidence was presented to the online safety scrutiny committee of the effects on children and adolescents of 24/7 exposure to social media. The Online Safety Bill is a very welcome and major step forward, but new regulation and continual vigilance will remain essential.

Fourthly, all citizens have the right to be educated to enable them to flourish mentally, emotionally and economically alongside artificial intelligence. It seems to me that, of these five areas, this is where the Government have been weakest. A much greater investment is needed by the Department for Education and across government to educate society on the nature and deployment of AI, and on its benefits and risks. Parents need help to support children growing up in a digital world. Workers need to know their rights in the digital economy, and fresh legislation will be needed to promote good work. There needs to be even better access to new skills and training. We need to strive as a society for even greater inclusion. How do the Government propose to offer fresh leadership in this area?

Finally, the autonomous power to hurt, destroy or deceive human beings should never be vested in artificial intelligence, as others have said. This final point highlights a major piece of unfinished business in both reports: engagement with the challenging and difficult questions of lethal autonomous weapons systems. The technology and capability to deploy AI in warfare is developing all the time. The time has come for a United Nations treaty to limit the deployment of killer robots of all kinds. This Government and Parliament, as the noble Lord, Lord Browne, eloquently said, urgently need to engage with this area and, I hope, take a leading role in the governance of research and development.

AI can bring, and has brought, many benefits, as well as many risks. There is great openness and willingness on the part of many working in the field to engage with the humanities, philosophers and the faith communities. There is a common understanding that the knowledge brought to us by science needs to be deployed with wisdom and humility for the common good. AI will continue to raise sharp questions of what it means to be human, and of how to build a society and a world where all can flourish. As many have pointed out, even the very best examples of AI as yet come nowhere near the complexity and wonder of the human mind and person. We have been given immense power to create but we are ourselves, in the words of the psalmist, fearfully and wonderfully created.

Freedom of Speech

Lord Bishop of Oxford Excerpts
Friday 10th December 2021

Lords Chamber
The Lord Bishop of Oxford

My Lords, it is a great privilege and honour, as always, to follow the noble and right reverend Lord, Lord Harries, one of my distinguished predecessors. I am grateful for this timely debate and to the most reverend Primate for his very comprehensive introduction. In a few days’ time, as we have heard, the scrutiny committee of both Houses will publish its report on the online safety legislation: a potentially vital web of provisions to prevent harm to individuals and, I hope, to society.

The debate around the online safety Bill will raise questions of principle about freedom of speech, and I very much support the most reverend Primate’s case that the free exchange of ideas is a keystone of our society and democracy. In many areas, as we have heard, those freedoms need a more robust defence. In others, the rights of the most vulnerable need protection from harm.

According to Proverbs 15, verse 4:

“A gentle tongue is a tree of life, but perverseness in it breaks the spirit.”

Words can be an immense blessing but, when amplified through social media, they can also be weapons of mass destruction to people and societies. Consequently, as a society, we will need wisdom to discriminate and to make judgments about the limits and boundaries of our freedoms in the light of these new technologies, and this debate in the coming months must avoid lapsing into hollow slogans on either side.

We have seen the rapid evolution and spread of social media over less than 20 years. Regulators have struggled to keep up, or even to reach the starting line. The big tech companies at present largely set their own rules and evaluate their own compliance.

I have learned that the development of ethical guidance for new technologies is not about the invention of new moral codes or principles. It is largely about the sensible translation and application of existing moral standards to the online world, especially in the protection of children, minorities and the most vulnerable. Freedom of speech is indeed to be preserved, but it, too, must be subject, online as offline, to a yet higher law of civility and mutuality. The UK Government have decades of experience in regulating broadcast content around these tensions, and it is this experience which must now be applied to new technologies.

It must be right, therefore, that major corporations which act as publishers of potentially harmful content should have a duty of care both to individuals and to society. A greater share of the immense profits realised in advertising needs to be ploughed back into protection of the vulnerable. Algorithms must be subject to scrutiny, especially when they are shown to amplify hatred and to target those already at risk. There must be robust protection for the young through careful age verification, which is urgently needed.

Anger, hatred and vitriol are all around us because the social media companies have discovered that this is where the greatest profits lie. It would be perfectly possible for social media to bring to the top of our feed stories of faith, hope and love rather than of cruelty and venom. Honest argument and exchange of ideas is one thing, but, at present, opaque microtargeting sold to the highest bidder distorts the societal context of freedom, challenging the very nature of democracy.

A century ago, the British Government took the significant step of establishing the British Broadcasting Corporation in the face of rapidly developing new technology—then, radio. The BBC was founded in an intermediate space: on a strong ethic of public service, including freedom of speech and independence of governance. Public service broadcasting has provided a model of best practice in these debates, alongside the work of regulators.

Is it possible to imagine a similar public service provider for the 21st century: search engines free of advertising, social networking freed from the blind pursuit of profit, messaging services which do not mine our data, and all protecting the rights of the child? Perhaps the Minister could indicate in his reply whether the Government might be willing to explore this kind of radical intervention, social media in public service, in this vital area.

The existing tech sector is urgently in need of both new regulation and a wise regulator, new rules which will enable all to enjoy the benefits of technology without the dangers and, I hope, a new and match-fit regulator in Ofcom. It will be essential that Ofcom itself pays careful attention to gathering wisdom and to the ethical formation of its board and senior team.

We need a public debate on online safety that extends far beyond this Parliament, but I also hope that, as we consider the proposals that will be published in the coming days, we in this Chamber avoid a lazy caricature that uses freedom of speech as some kind of trump card to dissipate all regulation. Instead, I hope and pray that we will, through reason and argument, seek to balance the preservation of those freedoms with robust regulation and a wise and independent regulator.