Debates between Lord Bethell and Lord Knight of Weymouth during the 2019-2024 Parliament

Tue 2nd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 2nd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 27th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)

Online Safety Bill

Debate between Lord Bethell and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, I think the whole Committee is grateful to my noble friend Lady Ritchie for introducing these amendments so well.

Clearly, there is a problem. The anecdote from the noble Baroness, Lady Kidron, about the call she had had with the barrister relating to those freshers’ week offences, and the sense that people were both offenders and victims, underscored that. In my Second Reading speech I alluded to the problem of the volume of young people accessing pornography on Twitter, and we see the same on Reddit, Discord and a number of other platforms. As the noble Baroness said, it is changing what so many young people perceive to be normal about sexual relationships, and that has to be addressed.

Ofcom very helpfully provided a technical briefing on age assurance and age verification for Members of your Lordships’ House—clearly it did not persuade everybody, otherwise we would not be having this debate. Like the noble Lord, Lord Clement-Jones, I am interested in this issue of whether it is proportionate to require age verification, rather than age assurance.

For example, on Amendment 83 in my noble friend’s name in respect of search, I was trying to work out in my own mind how that would work. If someone used search to look for pornographic content and put in an appropriate set of keywords but was not logged in—so the platform would not know who they are—and if age verification was required, would they be interrupted with a requirement to go through an age-verification service before the search results were served up? Would the search results be served up but without the thumbnails of images and with some of the content suppressed? I am just not quite sure what the user experience would be like with a strict age-verification regime being used, for example, in respect of search services.

Lord Bethell (Con)

My Lords, some light can be shone on that question by thinking a little about what the gambling industry has been through in the last few years as age verification has got tougher in that area. To answer the noble Lord’s question, if someone does not log into their search and looks for a gambling site, they can find it, but when they come to try to place a bet, that is when age verification is required.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My noble friend Lord Stevenson apologises that he can no longer be with the Committee, and he apologised to me that I suddenly find myself introducing this amendment. It heads up an important group because it tackles the issue of enforcement and, in essence, how we ensure that Ofcom has all the tools it needs to persuade some of the richest, largest and most litigious companies in the world to comply with the regime we are setting out in the Bill. Amendment 33, which my noble friend tabled and I am moving, sets out an offence of negligently failing to comply with a relevant duty in respect of the child safety duties, and would make that an imprisonable offence for a senior manager or other officer. I recall that those of us who sat on the Joint Committee discussed the data protection regime and whether companies could have an officer, designated similarly to the data controller, responsible for the safety duties with which the company would have to comply.

Clearly, this amendment has now been superseded by the government amendments that were promised, and which I am sure my noble friend was looking to flush out with this amendment. Flushed they are, so I will not go into any great detail about Amendment 33, because it is better to give time to the Minister to clarify the Government’s intentions. I shall listen carefully to him, as I will to the noble Lord, Lord Curry, who has great expertise in better regulation and who, I am sure, in speaking to his amendments, will give us the benefit of his wisdom on how we can make this stick.

That leaves my Amendment 219, which in essence is about the supply chain that regulated companies use. I am grateful to the noble Lords, Lord Mann and Lord Austin, and the noble Baroness, Lady Deech, for putting their names to the amendment. Their enthusiasm did not run to missing the Arsenal game and coming to support in the Chamber, but that implies great trust in my ability to speak to the amendment, for which I accept the responsibility and compliment.

The amendment was inspired by a meeting that some Members of your Lordships’ House and the other place had in an all-party group that was looking, in particular, at the problems of the incel culture online. We heard from various organisations about how incel culture relates to anti-Semitism and misogyny, and how such content proliferates and circulates around the web. It became clear that it is fairly commonplace to use things such as cloud services to store the content and that the links are then shared on platforms. On the mainstream platforms there might be spaces where, under the regime we are discussing in the Bill now that we have got rid of the controversial “legal but harmful” category, this content might be seen as relatively benign, certainly within the bounds of freedom of expression, but it starts to capture the interest of its target demographic. Those users are then taken off by links into smaller, less regulated sites and then, in turn, by links into cloud services where the really harmful content is hosted.

Therefore, by way of what reads as an exceptionally complicated and difficult amendment in respect of entities A, B and C, we are trying to understand whether it is possible to bring in those elements of the supply chain, of the technical infrastructure, that are used to disseminate hateful content. Such content too often leads to young men taking their own lives and to the sort of harm that we saw in Plymouth, where that young man went on the rampage and killed a number of people. His MP was one of the Members of Parliament at that meeting. That is what I want to explore with Amendment 219, which opens the possibility for this regime to ensure that well-resourced platforms cannot hide behind other elements of the infrastructure to evade their responsibilities.

Lord Bethell (Con)

My Lords, I beg the forbearance of the Committee because, despite the best efforts of the Whips, this group includes two major issues that I must tackle.

Starting with senior management liability, I thank the Minister and the entire ministerial team for their engagement on this big and important subject. I am enormously proud of the technology sector and the enormous benefits that it has brought to the economy and to society. I remain a massive champion of innovation and technology in the round. However, senior executives in the technology sphere have had a long-standing blind spot. Their manifesto is that the internet is somehow different from the rest of the real world and that nothing must stand in its way. My noble friend Lord Moylan gave that pony quite a generous trot round the arena, so I will not go through it again, but when it comes to children, they have consistently failed to take seriously their safeguarding responsibilities.

I spoke in Committee last week of my experience at the Ministry of Sound. When I saw the internet in the late 1990s, I immediately saw a wonderful opportunity to target children, to sell to them, to get past their parents and normal regulation, and to get into their homes and their wallets. Lots of other people had the same thought, and for a long time we have let them do what they like. This dereliction of their duty of care has led to significant consequences, and the noble Lord, Lord Russell, spoke very movingly about that. Those consequences are increasing all the time because of the take-up of mobile phones and computers by ever younger children. That has got to stop, and it is why we are here. That is why we have this Bill—to stop those consequences.

To change this, we cannot rely just on rhetoric, fines and self-regulation. We tried that, the experiment has failed, and we must try a different approach. We found that exhortations and a playing-it-nicely approach failed in the financial sector before the financial crisis. We remember the massive economic and societal costs of that failure. Likewise, in the tech sector, senior managers of firms big and small must be properly incentivised and held accountable for identifying and mitigating risks to children in a systematic way. That is why introducing senior management liability for child safety transgressions is critical. Senior management must be accountable for ensuring that child safety permeates the company and be held responsible when risks of serious harm arise or gross failures take place. Just think how the banks have changed their attitude since the financial crisis because of senior liability.

I am pleased that the Government have laid their own amendment, Amendment 200A. I commend the Minister for bringing that forward and am extremely grateful to him and to the whole team for their engagement around this issue. The government amendment creates a new offence, holding senior managers accountable for failure to comply with confirmation decisions from Ofcom relating to protecting children from harmful content. I hope that my noble friend will agree that it makes Ofcom’s job easier by providing clear consequences for non-compliance with such decisions.

It is a very good amendment, but there are some gaps, and I would like to address those. It is worrying that the government amendment does not cover duties related to tackling child sexual exploitation and abuse. As it stands, this amendment is a half-measure which fails to hold senior managers liable for the most severe abuse online. Child sexual abuse and exploitation offences are at a record high, as we heard earlier. NSPCC research shows that there has been an 84% rise in online grooming since 2017-18. Tech companies must be held accountable for playing their role in tackling this.

That is why the amendment in my name does the following: first, it increases the scope of the Government’s amendment to make individuals also responsible for confirmation decisions on illegal safety duties related to child sexual abuse and exploitation. Secondly, it brings search services into scope, including both categories of service providers, which is critical for ensuring that a culture of compliance is adopted throughout the sector.

Online Safety Bill

Debate between Lord Bethell and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, I too agree that this has been a really useful and interesting debate. It has featured many birthday greetings to the noble Baroness, Lady Kidron, in which I obviously join. The noble Lord, Lord Moylan, bounced into the debate in a way that tested the elasticity of the focus of the group, and bounced out again. Like the noble Lord, Lord Clement-Jones, I was particularly struck by the speech from the noble Baroness, Lady Harding, on the non-mandatory nature of the codes. Her points about reducing Ofcom’s workload, and about mandatory codes having precedent, were really significant, and I look forward to the Minister’s response.

If I have understood it correctly, the codes will be generated by Ofcom, and the Secretary of State will then table them as statutory instruments—so they will be statutory, non-mandatory codes, but with statutory penalties. Trying to unravel that in my mind was a bit of a thing as I was sitting there. Undoubtedly, we are all looking forward to the Minister’s definition of harm, which he promised us at the previous meeting of the Committee.

I applaud the noble Lord, Lord Russell, for the excellent way in which he set out the issues in this grouping and—along with the Public Bill Office—for managing to table these important amendments. Due to the Bill’s complexity, it is an achievement to get the relatively simple issue of safety by design for children into amendments to Clause 10 on children’s risk assessment duties for user-to-user services; Clause 11 on the safety duties protecting children; and the reference to risk assessments in Clause 19 on record-keeping. There is a similar set of amendments applying to search; to the duties in Clause 36 on codes of practice duties; to Schedule 4 on the content of codes of practice; and to Clause 39 on the Secretary of State’s powers of direction. You can see how complicated the Bill is for those of us attempting to amend it.

What the noble Lord and his amendments try to do is simple enough. I listened carefully to the noble Baroness, Lady Fox, as always. The starting point is, when designing, to seek to eliminate harm. That is not to say that they will eliminate all potential harms to children, but the point of design is to seek to eliminate harms if you possibly can. It is important to be clear about that. Of course, it is not just the content but the systems that we have been talking about, and ensuring that the codes of practice that we are going to such lengths to legislate for are stuck to—that is the point made by the noble Baroness, Lady Harding—relieving Ofcom of the duty to assess all the alternative methods. We certainly support the noble Lord, Lord Russell, in his amendments. They reinforce that it is not just about the content; the algorithmic dissemination, in terms of volume and context, is really important, especially as algorithms are dynamic—they are constantly changing in response to the business models that underpin the user-to-user services that we are debating.

The business models want to motivate people to be engaged, regardless of safety in many ways. We have had discussion of the analogy of cars and planes from the noble Lord, Lord Allan. As I recall, in essence he said that in this space there are some things that you want to regulate like planes, to ensure that there are no accidents, and some where you trade off freedom and safety, as we do with the regulation of cars. In this case, it is a bit more like regulating for self-driving cars; in that context, you design a lot more around trying to anticipate all the things that humans, when driving, know instinctively, because they are more ethical individuals than you could ever programme an AI to be when driving a car. I offer that slight adjustment, and I hope that it helps the noble Lord, Lord Moylan, when he is thinking about trains, planes and automobiles.

In respect of the problem of the business models and their prioritisation of engagement over safety, I had contact this weekend and last week from friends much younger than I am who are users of Snap. I am told that there is an AI chatbot on Snap, which I am sure is about engaging people for longer and collecting more data so that you can engage them for longer still and, potentially, use that data to drive advertising. But you can pay to get rid of that chatbot, which is the business model moving somewhere else as and when we make it harder for it to make money as it is. Snap previously had location sharing, which you had to turn off. It created various harms and risks for children in that their location was being shared with other people without them necessarily authorising it. We can all see how that could create issues.

Lord Bethell (Con)

Does the noble Lord have any reflections, talking about Snap, as to how the internet has changed in our time? It was once really for adults, when it was on a PC and it was only adults who had access to it. There has, of course, been a huge explosion in child access to the internet because of the mobile phone—as we have heard, two-thirds of 10 year-olds now have a mobile phone—and an app such as Snap now has a completely different audience from the one it had five or 10 years ago. Does the noble Lord have any reflections on what the consequences of that explosion of children’s access to applications such as Snap have been for those thinking about the harms to, and protection of, children?

Online Safety Bill

Debate between Lord Bethell and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, in keeping with the Stevenson-Knight double act, I am leaving it to my noble friend to wind up the debate. I will come in at this point with a couple of questions and allow the Minister to have a bit of time to reflect on them. In doing so, I reinforce my support for Amendment 295 in the name of the noble Lord, Lord Russell, which refers to volume and frequency also being risk factors.

When I compare Amendment 20 with Clause 10(6), which refers to children’s risk assessments and what factors should be taken into account in terms of the risk profile, I see some commonality and then some further things which Amendment 20, tabled by the noble Baroness, Lady Kidron, adds. In my opinion, it adds value. I am interested in how the Minister sees the Bill, as it stands currently, covering some issues that I will briefly set out. I think it would be helpful if the Committee could understand whether there are ways in which the Bill already deals with some of the issues so wonderfully raised by the noble Baroness, and if we can flush those out.

I do not see proposed new subsection (b)(iii),

“risks which can build up over time”,

mentioned in the Bill, nor explicit mention of proposed new subsection (b)(iv),

“the ways in which level of risks can change when experienced in combination with others”,

which I think is critical in terms of the way the systems work. Furthermore, proposed new subsection (b)(vii),

“the different ways in which the service is used including but not limited to via virtual and augmented reality technologies”,

starts to anticipate some other potential harms that may be coming very rapidly towards us and our children. Again, I do not quite see it included. I see “the design of functionalities”, “the business model” and “the revenue model”. There is a lot about content in the original wording of the Bill, rather less so here, and, clearly, I do not see anything in respect of the UN Convention on the Rights of the Child, which has been debated in separate amendments anyway. I wanted to give the Minister the opportunity to respond on those points.

Lord Bethell (Con)

My Lords, I restate my commitment to Amendments 20, 93 and 123, which are in my name and those of the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Oxford and the noble Lord, Lord Stevenson, and to the noble Baroness’s Amendment 74. It is a great honour to follow the noble Lord, Lord Knight. He put extremely well some key points about where there are gaps in the existing Bill. I will build on why we have brought forward these amendments in order to plug those gaps.

In doing so, I wish to say that it has been a privilege to work with the right reverend Prelate, the noble Baroness and the noble Lord, Lord Stevenson. We are not from the same political geographies, but that collaboration demonstrates the breadth of the political concern, and the strength of feeling across the Committee, about these important gaps when it comes to harms—gaps that, if not addressed, will put children at great risk. In this matter we are very strongly united. We have been through a lot together, and I believe this unlikely coalition demonstrates how powerful the feelings are.

It has been said before that children are spending an increasing amount of their lives online. However, the steepness of that inflection in the last few years has been understated, as has how much further it has to go. Mobile phone penetration among 10 year-olds is already around 75%, and access is reaching ever younger children and spreading ever more broadly.

In fact, the digital world is totally inescapable in the life of a child, whether that is for a young child who is four to six years old or an older child who is 16 or 17. It is increasingly where they receive their education—I do not think that is necessarily a good thing, but that is arguable—it is where they establish and maintain their personal relationships and it is a key forum for their self-expression.

For anyone who suspects otherwise, I wish to make it clear that I firmly believe in innovation and progress, and I regard the benefits of the digital world as really positive. I would never wish to prevent children accessing the benefits of the internet, the space it creates for learning and building community, and the opportunities it opens for them. However, environments matter. The digital world is not some noble wilderness free from original sin or a perfect, frictionless marketplace where the best, nicest, and most beautiful ideas triumph. It is a highly curated experience defined by the algorithms and service agreements of the internet companies. That is why we need rules to ensure that it is a safe space for children.

I started working on my first internet business in 1995, nearly 30 years ago. I was running the Ministry of Sound, and we immediately realised that the internet was an amazing way of getting through to young people. Our target audiences were either clubbers aged over 18 or the younger brothers and sisters of clubbers who bought our merchandise. The internet gave us an opportunity to get past all the normal barriers—past parents and regulation to reach a wonderful new market. I built a good business and it worked out well for me, but those were the days before GDPR and before we understood the internet as we do now. I know from my experience that we need to ensure that children are protected and shielded from the harms that bombard them, because there are strong incentives—mainly financial but also other, malign incentives—for bad actors to use the internet to get through to children.

Unfortunately, as the noble Baroness, Lady Kidron, pointed out, the Bill as it stands does not achieve that aim. Take, for example, contact harms, such as grooming and child sexual abuse. In February 2020, Bark, a US-based organisation that helps families manage and protect their children’s digital lives, launched an online persona of an 11 year-old girl, whom it called Bailey. Bailey’s online persona clearly showed that she was an ordinary 11 year-old, posting content that is ordinary for an 11 year-old. Within 30 seconds of her persona being launched online, she received a like from a man whose profile picture was a penis. Within two minutes, multiple messages were received from men, and within five minutes a video call. Shortly afterwards, she received requests from men to meet up. I remind your Lordships that Bailey was 11 years old. These are not trivial content harms; these are attempts to contact a minor using the internet as a medium.

Coronavirus (COVID-19)

Debate between Lord Bethell and Lord Knight of Weymouth
Tuesday 3rd March 2020

Lords Chamber

Lord Bethell

The noble Lord makes an important point. Polling to date has demonstrated that the British public have moved past complacency and are now seriously focused on this issue. Their trust in the Government remains high, and their engagement on solutions is profound. That feels like the right place to be.

Lord Knight of Weymouth (Lab)

My Lords, I noted what the Minister said about schools and I support his position. However, I also note that the Chief Medical Officer for Wales is today reported as saying that the peak in infection may be in May and June, coinciding with the time when up to a million young people will be taking public examinations in large sports halls. Can he reassure me that Ofqual is having conversations with examination boards about a contingency measure for delaying those examinations if necessary, and with universities about the admissions process if A-level results come out later?

Lord Bethell

The noble Lord makes an extremely important point, which I cannot answer specifically, as that would be for the Department for Education. If I may answer in the round, it is the Government’s objective to avoid as much economic and social disruption as possible, while making safety our number one priority. That is our guiding star.