Lord Bethell debates involving the Department for Digital, Culture, Media & Sport

Online Safety Bill, Lords Chamber, Committee stage

Baroness Kidron (CB)

My Lords, I support the noble Baroness, Lady Ritchie, in her search to make it clear that we do not need to take a proportionate approach to pornography. I would be delighted if the Minister could indicate in his reply that the Government will accept the age-assurance amendments in group 22 that are coming shortly, which make it clear that porn on any regulated service, under Part 3 or Part 5, should be behind an age gate.

In making the case for that, I want to say very briefly that, after the second day of Committee, I received a call from a working barrister who represented 90 young men accused of serious sexual assault. Each was a student and many were in their first year. A large proportion of the incidents had taken place during freshers’ week. She rang to make sure that we understood that, while what each and every one of them had done was indefensible, these men were also victims. As children brought up on porn, they believed that their sexual violence was normal—indeed, they told her that they thought that was what young women enjoyed and wanted. On this issue there is no proportionality.

Lord Bethell (Con)

My Lords, I also support Amendments 29, 83 and 103 from the noble Baroness, Lady Ritchie. As currently drafted, the Bill makes frequent reference to Ofcom taking into account

“the size and capacity of … a service”

when it determines the extent of the measures a site should apply to protect children. We have discussed size on previous days; I am conscious that the point has been made in part, but I hope the Committee will forgive me if I repeat it clearly. When it comes to pornography and other harms to children, size does matter. As I have said many times recently, porn is porn no matter the size of the website or publisher involved. It does not matter whether it is run by a huge company such as MindGeek or out of a shed in London or Romania by a small gang of people. The harm of the content to children is still exactly the same.

Our particular concern is that, if Ofcom’s regulations are applied only to the bigger companies, that will create a lot of space for smaller organisations which do not bend to the regulations to gain a competitive advantage over the larger players and occupy that space. That is the concern of the bigger players. They are very open to age verification; what concerns them is that they will face an unlevel playing field. It is a classic concern of big players facing regulation in a market: that bad actors will gain competitive advantage. We should be very cognisant of that when thinking about how the regulations on age verification for porn will be applied. The measures should therefore be applied in proportion to the risk of harm to children posed by a porn site, not in proportion to the site’s financial capacity or the impact on its revenues of basic protections for children.

In this, we are applying basic, real-world principles to the internet. We are denying its commonly held exceptionalism, which I think we are all a bit tired of. We are applying the same principles that you might apply in the real world, for instance, to a kindergarten, play centre, village church hall, local pub, corner shop or any other kind of business that brings itself in front of children. In other words, if a company cannot afford to implement or does not seem capable of implementing measures that protect children, it should not be permitted by law to put itself in front of the general public. That is the principle that we apply in the real world, and that is the principle we should be applying on the internet.

Allowing a dimension of proportionality to apply to pornography cases creates an enormous loophole in the legislation, which at best will delay enforcement for particular sites when it is litigated and at worst will disable regulatory action completely. That is why I support the amendments in the name of the noble Baroness, Lady Ritchie.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, I think the whole Committee is grateful to my noble friend Lady Ritchie for introducing these amendments so well.

Clearly, there is a problem. The anecdote from the noble Baroness, Lady Kidron, about the call she had had with the barrister relating to those freshers’ week offences, and the sense that people were both offenders and victims, underscored that. In my Second Reading speech I alluded to the problem of the volume of young people accessing pornography on Twitter, and we see the same on Reddit, Discord and a number of other platforms. As the noble Baroness said, it is changing what so many young people perceive to be normal about sexual relationships, and that has to be addressed.

Ofcom very helpfully provided a technical briefing on age assurance and age verification for Members of your Lordships’ House—clearly it did not persuade everybody, otherwise we would not be having this debate. Like the noble Lord, Lord Clement-Jones, I am interested in this issue of whether it is proportionate to require age verification, rather than age assurance.

For example, on Amendment 83 in my noble friend’s name in respect of search, I was trying to work out in my own mind how that would work. If someone used search to look for pornographic content and put in an appropriate set of keywords but was not logged in—so the platform would not know who they are—and if age verification was required, would they be interrupted with a requirement to go through an age-verification service before the search results were served up? Would the search results be served up but without the thumbnails of images and with some of the content suppressed? I am just not quite sure what the user experience would be like with a strict age-verification regime being used, for example, in respect of search services.

Lord Bethell (Con)

My Lords, some light can be shone on that question by thinking a little about what the gambling industry has been through in the last few years as age verification has got tougher in that area. To answer the noble Lord’s question, if someone does not log into their search and looks for a gambling site, they can find it, but when they come to try to place a bet, that is when age verification is required.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My noble friend Lord Stevenson apologises that he can no longer be with the Committee, and he apologised to me that I suddenly find myself introducing this amendment. It heads up an important group because it tackles the issue of enforcement and, in essence, how we ensure that Ofcom has all the tools it needs to persuade some of the richest, largest and most litigious companies in the world to comply with the regime we are setting out in the Bill. Amendment 33, which my noble friend tabled and I am moving, sets out an offence of negligently failing to comply with a relevant duty in respect of the child safety duties, and would make that an imprisonable offence for a senior manager or other officer. I recall that those of us who sat on the Joint Committee discussed the data protection regime and whether, in respect of the safety duties with which a company would have to comply, there could be a designated officer similar to the data controller.

Clearly, this amendment has now been superseded by the government amendments that were promised, and which I am sure my noble friend was looking to flush out with this amendment. Flushed they are, so I will not go into any great detail about Amendment 33, because it is better to give time to the Minister to clarify the Government’s intentions. I shall listen carefully to him, as I will to the noble Lord, Lord Curry, who has great expertise in better regulation and who, I am sure, in speaking to his amendments, will give us the benefit of his wisdom on how we can make this stick.

That leaves my Amendment 219, which in essence is about the supply chain that regulated companies use. I am grateful to the noble Lords, Lord Mann and Lord Austin, and the noble Baroness, Lady Deech, for putting their names to the amendment. Their enthusiasm did not run to missing the Arsenal game and coming to support in the Chamber, but that implies great trust in my ability to speak to the amendment, for which I accept the responsibility and compliment.

The amendment was inspired by a meeting that some Members of your Lordships’ House and the other place had in an all-party group looking, in particular, at the problems of incel culture online. We heard from various organisations about how incel culture relates to anti-Semitism and misogyny, and how such content proliferates and circulates around the web. It became clear that it is fairly commonplace to use things such as cloud services to store the content, with the links then shared on platforms. On the mainstream platforms, under the regime we are discussing now that we have got rid of the controversial “legal but harmful” category, there might be spaces where this content is seen as relatively benign, certainly within the bounds of freedom of expression, but it starts to capture the interest of its target demographic. Users are then taken off by links into smaller, less regulated sites and then, in turn, by links into cloud services where the really harmful content is hosted.

Therefore, by way of what reads as an exceptionally complicated and difficult amendment in respect of entities A, B and C, we are trying to understand whether it is possible to bring in those elements of the supply chain, of the technical infrastructure, that are used to disseminate hateful content. Such content too often leads to young men taking their own lives and to the sort of harm that we saw in Plymouth, where that young man went on the rampage and killed a number of people. His MP was one of the Members of Parliament at that meeting. That is what I want to explore with Amendment 219, which opens the possibility for this regime to ensure that well-resourced platforms cannot hide behind other elements of the infrastructure to evade their responsibilities.

Lord Bethell (Con)

My Lords, I beg the forbearance of the Committee because, despite the best efforts of the Whips, this group includes two major issues that I must tackle.

Starting with senior management liability, I thank the Minister and the entire ministerial team for their engagement on this big and important subject. I am enormously proud of the technology sector and the enormous benefits that it has brought to the economy and to society. I remain a massive champion of innovation and technology in the round. However, senior executives in the technology sphere have had a long-standing blind spot. Their manifesto is that the internet is somehow different from the rest of the real world and that nothing must stand in its way. My noble friend Lord Moylan gave that pony quite a generous trot round the arena, so I will not go through it again, but when it comes to children, they have consistently failed to take seriously their safeguarding responsibilities.

I spoke in Committee last week of my experience at the Ministry of Sound. When I saw the internet in the late 1990s, I immediately saw a wonderful opportunity to target children, to sell to them, to get past their parents and normal regulation, and to get into their homes and their wallets. Lots of other people had the same thought, and for a long time we have let them do what they like. This dereliction of their duty of care has led to significant consequences, and the noble Lord, Lord Russell, spoke very movingly about that. Those consequences are increasing all the time because of the take-up of mobile phones and computers by ever younger children. That has got to stop, and it is why we are here. That is why we have this Bill—to stop those consequences.

To change this, we cannot rely just on rhetoric, fines and self-regulation. We tried that, the experiment has failed, and we must try a different approach. We found that exhortations and a playing-it-nicely approach failed in the financial sector before the financial crisis. We remember the massive economic and societal costs of that failure. Likewise, in the tech sector, senior managers of firms big and small must be properly incentivised and held accountable for identifying and mitigating risks to children in a systematic way. That is why introducing senior management liability for child safety transgressions is critical. Senior management must be accountable for ensuring that child safety permeates the company and be held responsible when risks of serious harm arise or gross failures take place. Just think how the banks have changed their attitude since the financial crisis because of senior liability.

I am pleased that the Government have laid their own amendment, Amendment 200A. I commend the Minister for bringing that forward and am extremely grateful to him and to the whole team for their engagement around this issue. The government amendment creates a new offence, holding senior managers accountable for failure to comply with confirmation decisions from Ofcom relating to protecting children from harmful content. I hope that my noble friend will agree that it makes Ofcom’s job easier by providing clear consequences for non-compliance with such decisions.

It is a very good amendment, but there are some gaps, and I would like to address those. It is worrying that the government amendment does not cover duties related to tackling child sexual exploitation and abuse. As it stands, this amendment is a half-measure which fails to hold senior managers liable for the most severe abuse online. Child sexual abuse and exploitation offences are at a record high, as we heard earlier. NSPCC research shows that there has been an 84% rise in online grooming since 2017-18. Tech companies must be held accountable for playing their role in tackling this.

That is why the amendment in my name does the following: first, it increases the scope of the Government’s amendment to make individuals also responsible for confirmation decisions on illegal safety duties related to child sexual abuse and exploitation. Secondly, it brings search services into scope, including both categories of service providers, which is critical for ensuring that a culture of compliance is adopted throughout the sector.

Lord Knight of Weymouth (Lab)

My Lords, I too agree that this has been a really useful and interesting debate. It has featured many birthday greetings to the noble Baroness, Lady Kidron, in which I obviously join. The noble Lord, Lord Moylan, bounced into the debate, tested the elasticity of the focus of the group, and bounced out again. Like the noble Lord, Lord Clement-Jones, I was particularly struck by the speech from the noble Baroness, Lady Harding, on the non-mandatory nature of the codes. Her points about reducing Ofcom’s workload, and mandatory codes having precedent, were really significant and I look forward to the Minister’s response.

If I have understood it correctly, the codes will be generated by Ofcom, and the Secretary of State will then table them as statutory instruments—so they will be statutory, non-mandatory codes, but with statutory penalties. Trying to unravel that in my mind was a bit of a thing as I was sitting there. Undoubtedly, we are all looking forward to the Minister’s definition of harm, which he promised us at the previous meeting of the Committee.

I applaud the noble Lord, Lord Russell, for the excellent way in which he set out the issues in this grouping and—along with the Public Bill Office—for managing to table these important amendments. Due to the Bill’s complexity, it is an achievement to get the relatively simple issue of safety by design for children into amendments to Clause 10 on children’s risk assessment duties for user-to-user services; Clause 11 on the safety duties protecting children; and the reference to risk assessments in Clause 19 on record-keeping. There is a similar set of amendments applying to search; to the duties in Clause 36 on codes of practice duties; to Schedule 4 on the content of codes of practice; and to Clause 39 on the Secretary of State’s powers of direction. You can see how complicated the Bill is for those of us attempting to amend it.

What the noble Lord and his amendments try to do is simple enough. I listened carefully to the noble Baroness, Lady Fox, as always. The starting point is, when designing, to seek to eliminate harm. That is not to say that they will eliminate all potential harms to children, but the point of design is to seek to eliminate harms if you possibly can. It is important to be clear about that. Of course, it is not just the content but the systems that we have been talking about, and ensuring that the codes of practice that we are going to such lengths to legislate for are stuck to—that is the point made by the noble Baroness, Lady Harding—relieving Ofcom of the duty to assess all the alternative methods. We certainly support the noble Lord, Lord Russell, in his amendments. They reinforce that it is not just about the content; the algorithmic dissemination, in terms of volume and context, is really important, especially as algorithms are dynamic—they are constantly changing in response to the business models that underpin the user-to-user services that we are debating.

The business models want to motivate people to be engaged, regardless of safety in many ways. We have had discussion of the analogy on cars and planes from the noble Lord, Lord Allan. As I recall, in essence he said that in this space there are some things that you want to regulate like planes, to ensure that there are no accidents, and some where you trade off freedom and safety, as we do with the regulation of cars. In this case, it is a bit more like regulating for self-driving cars; in that context, you will design a lot more around trying to anticipate all the things that humans when driving will know instinctively, because they are more ethical individuals than you could ever programme an AI to be when driving a car. I offer that slight adjustment, and I hope that it helps the noble Lord, Lord Moylan, when he is thinking about trains, planes and automobiles.

In respect of the problem of the business models and their engagement over safety, I had contact this weekend and last week from friends much younger than I am, who are users of Snap. I am told that there is an AI chatbot on Snap, which I am sure is about engaging people for longer and collecting more data so that you can engage them even longer and, potentially, collect data to drive advertising. But you can pay to get rid of that chatbot, which is the business model moving somewhere else as and when we make it harder for it to make money as it is. Snap previously had location sharing, which you had to turn off. It created various harms and risks for children, in that their location was being shared with other people without them necessarily authorising it. We can all see how that could create issues.

Lord Bethell (Con)

Does the noble Lord have any reflections, talking about Snap, as to how the internet has changed in our time? It was once really for adults, when it was on a PC and only adults had access to it. There has, of course, been a huge explosion in child access to the internet because of the mobile phone—as we have heard, two-thirds of 10 year-olds now have a mobile phone—and an app such as Snap now has a completely different audience from the one it had five or 10 years ago. Does the noble Lord have any reflections on what the consequences of the explosion of children’s access to applications such as Snap have been for those thinking about the harms and protection of children?

Baroness Stowell of Beeston (Con)

My Lords, my contribution will be less officious than my intervention earlier in this group. In the last couple of years since I returned to the House—as I describe it—having spent time at the Charity Commission, I have noticed a new practice emerging of noble Lords reading out other people’s speeches. Every time I had seen it happen before, I had not said anything, but today I thought, “I can’t sit here and not say anything again”. I apologise for my intervention.

I am grateful to my noble friend Lord Moylan for bringing forward his amendments and for introducing them in the incredibly clear way he did; they cover some very complex and diverse issues. I know that there are other amendments in the group which might be described as similar to his.

There are a couple of things I want to highlight. One interesting thing about the debate on this group is the absence of some of our legal friends—I apologise to my noble friend Lady Buscombe, who is of course a very distinguished lawyer. The point I am making is that we are so often enriched by a lot of legal advice and contributions on some of the more challenging legal issues that we grapple with, but we do not have that today, and this is a very difficult legal issue.

It is worth highlighting again, as has been touched on a little in some of the contributions, the concern, as I understand it, with how the Bill is drafted in relation to illegal content and the potential chilling effect of these clauses on social media platforms. As has already been said, there is a concern that it might lead them to take a safety-first approach in order to avoid breaking the law and incurring the sanctions and fines that come with the Bill, which Ofcom will have the power to apply. That is the point we are concerned with here. It is the way in which this is laid out, and people who are much better equipped than I am have already explained the difference between evidence and reasonable grounds to infer.

What the noble Lord, Lord Allan, hit on in his contribution is also worth taking into account, and that is the role of Ofcom in this situation. One of the things I fear, as we move into an implementation phase and the consequences of the Bill start to impact on the social media firms, is the potential for the regulator to be weaponised in a battle on the cultural issues that people are becoming increasingly exercised about. I do not have an answer to this, but it is important to understand the danger of what we might come to expect of the regulator if the social media platforms act in a way that sends people looking for recourse, or for a venue in which to pursue an argument and a battle, in a manner that will not be helpful at all.

I am not entirely sure, given my lack of legal expertise—this is why I would have been very grateful for some legal expertise on this group—whether what my noble friend is proposing in his amendments is the solution, but I think we need to be very clear that this is a genuine problem. I am not sure, as things stand in the Bill, that we should be comfortable that it is not going to create problems. We need to find a way to be satisfied that this has been dealt with properly.

Lord Bethell (Con)

It is a great honour to follow my noble friend. I completely agree with her that this is a powerful discussion and there are big problems in this area. I am grateful also to my noble friend Lord Moylan for raising this in the first place. It has been a very productive discussion.

I approach the matter from a slightly different angle. I will not talk about the fringe cases—the ones where there is ambiguity, difficulty of interpretation, or responsibility or regulatory override, all of which are very important issues. The bit I am concerned about is where primary priority content that clearly demonstrates some kind of priority offence is not followed up by the authorities at all.

The noble Lord, Lord Allan, referred to this point, although he did slightly glide over it, as though implying, if I understood him correctly, that this was not an area of concern because, if a crime had clearly been committed, it would be followed up on. My fear and anxiety is that the history of the internet over the last 25 years shows that crimes—overt and clear crimes that are there for us to see—are very often not followed up by the authorities. This is another egregious example of where the digital world is somehow exceptionalised and does not have real-world rules applied to it.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

I want to clarify one point. I have had a slightly different experience, which is that for many people—women, at least—whom I have talked to recently, there is an over-enthusiasm and an over-zealous attitude to policing the speech of particular women and, as we have already heard, gender-critical women. It is often under the auspices of hate speech and there is all sorts of discussion about whether the police are spending too long trawling through social media. By contrast, if you want to get a policeman or policewoman involved in a physical crime in your area, you cannot get them to come out. So I am not entirely convinced. I think policing online speech is taking up far too much of the authorities’ time, not too little, and distracting them from tackling real social harms and criminal activity.

Lord Bethell (Con)

I defer to the noble Baroness, Lady Fox, on speech crime. That is not the area of my expertise, and it is not the purpose of my points. My points were to do with the kinds of crime that affect children in particular. His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services is very specific about that point. It says that “unacceptable delays are commonplace” and it gives a very large number of case studies. I will not go through them now because it is Thursday afternoon, but I think noble Lords can probably imagine the kinds of things we are talking about. They include years of delay, cases not taken seriously or overlooked, evidence lost, and so forth. The report found that too often children were put at risk because of this, and offenders were allowed to escape justice, and it gave 17 recommendations for how the police force should adapt in order to meet this challenge.

So my questions to the Minister are these. When we talk about things such as age verification for hardcore porn, we are quite often told that we do not need to worry about some of this because it is covered by illegal content provisions, and we should just leave it to the police to sort out. His Majesty’s Inspectorate gives clear evidence—this is a recent report from last month—that this is simply not happening in the way it should be. I therefore wondered what, if anything, is in the Bill to try to close down this particular gap. That would be very helpful indeed.

If it is really not for the purposes of this Bill at all—if this is actually to do with other laws and procedures, other departments and the way in which the resources for the police are allocated, as the noble Baroness, Lady Fox, alluded to—what can the Government do outside the boundaries of this legislation to mobilise the police and the prosecution services to address what I might term “digital crimes”: that is, crimes that would be followed up with energy if they occurred in the real world but, because they are in the digital world, are sometimes overlooked or forgotten?

Lord Allan of Hallam (LD)

My Lords, I would like to raise one issue that I forgot to mention earlier, and I think it would be more efficient to pose the question now to the Minister rather than interject when he is speaking.

On the Government’s Amendments 136A, 136B and 136C on the immigration offences, the point I want to make is that online services can be literal life-savers for people who are engaged in very dangerous journeys, including journeys across the Channel. I hope the Minister will be clear that the intention here is to require platforms to deal only with content, for example, from criminals who are offering trafficking services, and that there is no intention to require platforms somehow to withdraw services from the victims of those traffickers when they are using those services in the interest of saving their own lives or seeking advice that is essential to preserving their own safety.

That would create—as I know he can imagine—real ethical and moral dilemmas, and we should not be giving any signal that we intend to require platforms to withdraw services from people who are in desperate need of help, whatever the circumstances.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, in keeping with the Stevenson-Knight double act, I am leaving it to my noble friend to wind up the debate. I will come in at this point with a couple of questions and allow the Minister to have a bit of time to reflect on them. In doing so, I reinforce my support for Amendment 295 in the name of the noble Lord, Lord Russell, which refers to volume and frequency also being risk factors.

When I compare Amendment 20 with Clause 10(6), which refers to children’s risk assessments and what factors should be taken into account in terms of the risk profile, I see some commonality and then some further things which Amendment 20, tabled by the noble Baroness, Lady Kidron, adds. In my opinion, it adds value. I am interested in how the Minister sees the Bill, as it stands currently, covering some issues that I will briefly set out. There may be ways in which the Bill already deals with some of the issues so wonderfully raised by the noble Baroness, and it would be helpful if the Committee could flush those out.

I do not see proposed new subsection (b)(iii),

“risks which can build up over time”,

mentioned in the Bill, nor explicit mention of proposed new subsection (b)(iv),

“the ways in which level of risks can change when experienced in combination with others”,

which I think is critical in terms of the way the systems work. Furthermore, proposed new subsection (b)(vii),

“the different ways in which the service is used including but not limited to via virtual and augmented reality technologies”,

starts to anticipate some other potential harms that may be coming very rapidly towards us and our children. Again, I do not quite see it included. I see “the design of functionalities”, “the business model” and “the revenue model”. There is a lot about content in the original wording of the Bill, rather less so here; and, clearly, I do not see anything in respect of the UN Convention on the Rights of the Child, which has been debated in separate amendments anyway. I wanted to give the Minister some opportunity to respond on that.

Lord Bethell (Con)

My Lords, I restate my commitment to Amendments 20, 93 and 123, which are in my name and those of the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Oxford and the noble Lord, Lord Stevenson, as well as to the noble Baroness’s Amendment 74. It is a great honour to follow the noble Lord, Lord Knight. He put extremely well some key points about where there are gaps in the existing Bill. I will build on those points in explaining why we have brought forward these amendments to plug those gaps.

In doing so, I wish to say that it has been a privilege to work with the right reverend Prelate, the noble Baroness and the noble Lord, Lord Stevenson. We are not from the same political geographies, but that collaboration demonstrates the breadth of the political concern, and the strength of feeling across the Committee, about these important gaps when it comes to harms—gaps that, if not addressed, will put children at great risk. In this matter we are very strongly united. We have been through a lot together, and I believe this unlikely coalition demonstrates how powerful the feelings are.

It has been said before that children are spending an increasing amount of their lives online. However, the sharpness of that inflection in the last few years has been understated, as has how much further it has to go. Mobile phone penetration is already around 75% among 10 year-olds—it is getting younger, and it is getting broader.

In fact, the digital world is totally inescapable in the life of a child, whether that is for a young child who is four to six years old or an older child who is 16 or 17. It is increasingly where they receive their education—I do not think that is necessarily a good thing, but that is arguable—it is where they establish and maintain their personal relationships and it is a key forum for their self-expression.

For anyone who suspects otherwise, I wish to make it clear that I firmly believe in innovation and progress, and I regard the benefits of the digital world as really positive. I would never wish to prevent children accessing the benefits of the internet, the space it creates for learning and building community, and the opportunities it opens for them. However, environments matter. The digital world is not some noble wilderness free from original sin or a perfect, frictionless marketplace where the best, nicest, and most beautiful ideas triumph. It is a highly curated experience defined by the algorithms and service agreements of the internet companies. That is why we need rules to ensure that it is a safe space for children.

I started working on my first internet business in 1995, nearly 30 years ago. I was running the Ministry of Sound, and we immediately realised that the internet was an amazing way of getting through to young people. Our target audiences were either clubbers aged over 18 or the younger brothers and sisters of clubbers, who bought our merchandise. The internet gave us an opportunity to get past all the normal barriers—past parents and regulation—to reach a wonderful new market. I built a good business and it worked out well for me, but those were the days before GDPR and before we understood the internet as we do now. I know from my experience that we need to ensure that children are protected and shielded from the harms that bombard them, because there are strong incentives—mainly financial but also other, malign incentives—for bad actors to use the internet to get through to children.

Unfortunately, as the noble Baroness, Lady Kidron, pointed out, the Bill as it stands does not achieve that aim. Take, for example, contact harms, such as grooming and child sexual abuse. In February 2020, Bark, a US-based organisation that helps families manage and protect their children’s digital lives, launched an 11 year-old persona online whom it called Bailey. Bailey’s online persona clearly shows that she is an ordinary 11 year-old, posting content that is ordinary for an 11 year-old. Within 30 seconds of her persona being launched online, she received a like from a man whose profile picture was a penis. Within two minutes, multiple messages were received from men, and within five minutes a video call. Shortly afterwards, she received requests from men to meet up. I remind your Lordships that Bailey was 11 years old. These are not trivial content harms; these are attempts to contact a minor using the internet as a medium.

Baroness Harding of Winscombe (Con)

My Lords, I rise to speak along similar lines to the noble Baroness, Lady Kidron. I will address my noble friend Lord Moylan’s comments. I share his concern that we must not make the perfect the enemy of the good but, like the noble Baroness, I do not think that size is the key issue here, because of how tech businesses grow. Building a tech business is rather like building a skyscraper: if you get the foundations wrong, it is almost impossible to change how safe the building is as it goes up and up. As I said earlier this week, small tech businesses can become big very quickly, and, if you design your small tech business with the risks to children in mind at the very beginning, there is a much greater chance that your skyscraper will not wobble as it gets taller. On the other hand, if your small business begins by not taking children into account at all, it is almost impossible to address the problem once it is huge. I fear that this is the problem we face with today’s social media companies.

The noble Baroness, Lady Kidron, hit the nail on the head, as she so often does, in saying that we need to think about risk, rather than size, as the means of differentiating the proportionate response. In Clause 23, which my noble friend seeks to amend, the important phrase is “use proportionate measures” in subsection (2). Provided that we start with a risk assessment and companies are then under the obligation to make proportionate adjustments, that is how you build safe technology companies—it is just like how you build safe buildings.

Lord Bethell (Con)

My Lords, I will build on my noble friend’s comments. We have what I call the Andrew Tate problem. That famous pornographer and disreputable character started a business in a shed in Romania with a dozen employees. By most people’s assessment, it would have been considered a small business but, through content of pornography and physical assaults on women, he extremely quickly built something that served an estimated 3 billion page views, and it has had a huge impact on the children of the English-speaking world. A small business became a big, nasty business very quickly. That anecdote reinforces the point that small does not mean safe, and, although I agree with many of my noble friend’s points, the lens of size is perhaps not the right one to look through.

Lord Allan of Hallam (LD)

My Lords, I did not want to interrupt the noble Lord, Lord Moylan, in full flow as he introduced the amendments, but I believe he made an error in terms of the categorisation. The error is entirely rational, because he took the logical position rather than the one in the Bill. It is a helpful error because it allows us to quiz the Minister on the rationale for the categorisation scheme.

As I read it, in Clause 86, the categories are: category 1, which is large user-to-user services; category 2A, which is search or combined services; and category 2B, which is small user-to-user services. To my boring and logical binary brain, I would expect it to be: “1A: large user-to-user”; “1B: small user-to-user”; “2A: large search”; and “2B: small search”. I am curious about why a scheme like that was not adopted and we have ended up with something quite complicated. It is not only that: we now have this Part 3/Part 5 thing. I feel that we will be confused for years to come: we will be deciding whether something is a Part 3 2B service or a Part 5 service, and we will end up with a soup of numbers and letters that do not conform to any normal, rational approach to the world.

--- Later in debate ---

I hope the Minister will see the logic of a level playing field to deliver a policy with widespread support across all ages and political parties. Indeed, without addressing pornography separately—and, in turn, quickly—we will pass a Bill with no discernible impact before the next general election. While they are walking to the polling station, parents will still fear what their children are looking at online. This is a quick win and a popular move, and I hope the Government will amend the Bill accordingly so that this House does not need to do so when we revisit this important issue on Report.

Lord Bethell (Con)

My Lords, it is a tremendous honour to follow the noble Lord, Lord Browne, who put the case extremely well; I agree with every word he just said. I thank the noble Baroness, Lady Ritchie, for bringing forward this issue, which she has done extremely well. I thank Christian Action Research and Education, which has been fundamental in thinking through some of these issues and has written an extremely good brief on the subject. There is indeed an urgent need for consistent regulation of pornographic content wherever it occurs online, whether it is in Part 3, Part 4, Part 5 or wherever. That is why, with the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Oxford and the noble Lord, Lord Stevenson, I have tabled amendments to address age verification on pornography and harms in the round.

Our amendments, which we will get to on Thursday and on later days in Committee, are different from those raised by the noble Baroness, Lady Ritchie, and others, but it is worth noting that many of the principles are the same. In particular, all pornographic content should be subject to the same duties, in the interests of consistency and transparency, wherever it is. Porn is porn, regardless of where it occurs online, and it carries the same risk of harm, particularly to children, whether it is accessed on social media or on a dedicated porn site.

We know from the Children’s Commissioner’s research that, for instance, Twitter was the online platform where young people were most likely to have seen pornography—not Pornhub or one of the big tube sites, but Twitter. We also know that children consistently watch porn on dedicated porn sites. So why do we have inconsistent regulation of pornographic content in the Bill? This is the question I address to my noble friend the Minister. We can and we will get to the debate on how we will do this—indeed, I welcome further discussion with the Minister on how, and encourage him to have conversations across the House on this.

For today, we must look at why we have inconsistent regulation for pornographic content and what that means. As currently drafted, Part 3 services and Part 5 services are not subject to the same duties, as the noble Baroness rightly pointed out. Part 3 services, which include the biggest and most popular pornographic websites, such as Pornhub and Xvideos, as well as sites that host pornographic content, such as Twitter, will not be subject to regulation, including age verification, until secondary legislation is introduced, thereby delaying regulation of the biggest porn sites until at the very least 2025, if not 2026. This will create a massively unlevel playing field which, as others have said, will disincentivise compliance across the board, as well as leaving children with unfettered access to pornography on both social media sites and other user-to-user sites such as Pornhub.

Meanwhile, whichever commercially produced pornography websites are left in Part 5 will, as has already been suggested, simply change their functionality to become user-to-user and avoid regulation for another three years. I have a way in which this can be prevented and the noble Baroness, Lady Ritchie, has her way, but for today I stand with her in asking why the Government think this lack of consistency and fragmentation in the regulation of an industry that destroys childhoods and has implications that reverberate across society are to be accepted.

I look forward to hearing what the Minister has to say. It is clear to me that there is a consensus across the Committee and at every stage of the Bill that pornography should be regulated in a way that is consistent, clear and implemented as quickly as possible following Royal Assent—I have suggested within six months. Therefore, I would welcome discussions with the noble Baroness, Lady Ritchie, the Minister and others to ensure that this can be achieved.

--- Later in debate ---
Lord Bethell (Con)

Could the noble Lord advise us on how he would categorise a site such as Twitter, on which it is estimated that 13% of the page deliveries are to do with pornography? Does it qualify as a pornography site? To me, it is ambiguous. Such a large amount of its financial revenue comes from pages connected with pornography that it seems it has a very big foot in the pornography industry. How would he stop sites gaming definitions to benefit from one schedule or another? Does he think that puts great pressure on the regulator to be constantly moving the goalposts in order to capture who it thinks might be gaming the system, instead of focusing on content definition, which has a 50-year pedigree, is very well defined in law and is an altogether easier status to analyse and be sure about?

Lord Allan of Hallam (LD)

The Twitter scenario, and other scenarios of mixed sites, are some of the most challenging that we have to deal with. But I would say, straightforwardly, “Look, 13% is a big chunk, but the primary purpose of Twitter is not the delivery of pornography”. I use Twitter on a daily basis and I have never seen pornography on it. I understand that it is there and that people can go for it, and that is an issue, but I think people out there would say that for most people, most of the time, the primary purpose of Twitter is not pornography.

What we want to do—in answer to the noble Lord’s second point—is create an incentive for people to be recategorised in the right direction. There is an assumption here that it is all going to be about gaming the system. I actually think that there is an opportunity here for genuine changes. There will be a conversation with Twitter. It will be interesting, given Twitter’s current management—apparently it is run by a dog, so there will be a conversation with the dog that runs Twitter. In that conversation, the regulator, Ofcom, on our behalf, will be saying, “You could change your terms of service and get rid of pornography”. Twitter will say yes or no. If it says no, Ofcom will say, “Well, here are all the things we expect you to do in order to wall off that part of the site”.

That is a really healthy and helpful conversation to have with Twitter. I expect it is listening now and already thinking about how it will respond. But it would expect that kind of treatment, and I think the public would expect that conversation to be different from, and better than, just saying, “Twitter, you’re Pornhub. We’re just going to treat you like Pornhub”.

That is the distinction. As I say, we have an opportunity to get people to be more robust about either limiting or removing pornography, and I fear that the amendments we have in front of us would actually undermine rather than enhance that effort.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

Yes, that is correct. I was trying to address the points raised by the noble Baroness, but the noble Lord is right. The point on whether people might try to be treated differently by allowing comments or reviews on their content is that they would be treated the same way. That is the motivation behind the noble Baroness’s amendment trying to narrow the definition. There is no risk that a publisher of pornographic content could evade their Part 5 duties by enabling comments or reviews on their content. That would be the case whether or not those reviews contained words, non-verbal indications that a user liked something, emojis or any other form of user-generated content.

That is because the Bill has been designed to confer duties on different types of content. Any service with provider pornographic content will need to comply with the Part 5 duties to ensure that children cannot normally encounter such content. If they add user-generated functionality—

Lord Bethell (Con)

I am sorry to come back to the same point, but let us take the Twitter example. As a publisher of pornography, does Twitter then inherit Part 5 responsibilities in as much as it is publishing pornography?

Lord Parkinson of Whitley Bay (Con)

It is covered in the Bill as Twitter. I am not quite sure what my noble friend is asking me. The harms that he is worried about are covered in different ways. Twitter or another social medium that hosts such content would be hosting it, not publishing it, so would be covered by Part 3 in that instance.

Lord Bethell (Con)

Maybe my noble friend the Minister could write to me to clarify that point, because it is quite a significant one.

Lord Parkinson of Whitley Bay (Con)

Perhaps I will speak to the noble Lord afterwards and make sure I have his question right before I do so.

I hope that answers the questions from the noble Baroness, Lady Ritchie, and that on that basis, she will be happy to withdraw her amendment.

Lord Russell of Liverpool (CB)

My Lords, as I listen to the words echoing around the Chamber, I try to put myself in the shoes of parents or children who, in one way or another, have suffered as a result of exposure to things happening online. Essentially, the world that we are talking about has been allowed to grow like Topsy, largely unregulated, at a global level and at a furious pace, and that is still happening as we do this. The horses have not just bolted the stable; they are out of sight and across the ocean. We are talking about controlling and understanding an environment that is moving so quickly that, however fast we move, we will be behind it. Whatever mousetraps we put in place to try to protect children, we know there are going to be loopholes, not least because children individually are probably smarter than we are collectively at knowing how to get around well-meaning safeguards.

There are ways of testing what is happening. Certain organisations have used what they term avatars. Essentially, you create mythical profiles of children, which are clearly stated as being children, and effectively let them loose in the online world in various directions on various platforms and observe what happens. The tests that have been done on this—we will go into this in more detail on Thursday when we talk about safety by design—are pretty eye-watering. The speed with which these avatars, despite being openly stated as profiles of children, are deluged by a variety of content that should be nowhere near children is dramatic, and the technique is incredibly effective at exposing it.

I put it to the Minister and the Bill team that one of the challenges for Ofcom will be not to be so far behind the curve that it is always trying to catch up. It is like being a surfer: if you are going to keep going then you have to keep on the front side of the wave. The minute you fall behind it, you are never going to catch up. I fear that, however well-intentioned so much of the Bill is, unless and until His Majesty’s Government and Ofcom recognise that we are probably already slightly behind the crest of the wave, whatever we try to do and whatever safeguards we put in place are not necessarily going to work.

One way we can try to make what we do more effective is the clever, forensic use of approaches such as avatars, not least because I suspect their efficacy will be dramatically increased by the advent and use of AI.

Lord Bethell (Con)

Tim Cook, the CEO of Apple, put it very well:

“Kids are born digital, they’re digital kids now … And it is, I think, really important to set some hard rails around it”.

The truth is that in the area of app stores, Google and Apple, which, as we have heard, have a more than 95% share of the market, are just not voluntarily upholding their responsibilities in making the UK a safe place for children online. There is an air of exceptionalism about the way they behave that suggests they think the digital world is somehow different from the real world. I do not accept that, which is why I support the amendments in the name of my noble friend Lady Harding and others—Amendments 19, 22, 298, 299 and other connected amendments.

There are major holes in the app stores’ child safety measures, which mean that young teens can access adult apps that offer dating, random chats, casual sex and gambling, even when Apple and Google emphatically know that the user is a minor. I will give an example. Using an Apple ID for a simulated 14 year-old, the Tech Transparency Project looked at 80 apps in the App Store that are theoretically limited to 17 and older. It found that underage users could very easily evade age restrictions in the vast majority of cases. There is a dating app that opens directly into pornography before ever asking the user’s age; adult chat apps filled with explicit images that never ask the user’s age; and a gambling app that lets the minor account deposit and withdraw money.

What kind of apps are we talking about here? We are talking about apps such as UberHoney; Eros, the hook-up and adult chat app; Hahanono—Chat & Get Naughty; and Cash Clash Games: Win Money. The investigation found that Apple and the apps essentially pass the buck to each other when it comes to blocking underage users, making it easy for young teens to slip through the system. My day-to-day experience as a parent of four children completely echoes that investigation, and it is clear to me that Apple and Google just do not share age data with the apps in their app stores, or else children would not be able to download those apps.

There is a wilful blindness to minors tweaking their age. Parental controls on mobile phones are, to put it politely, a joke. It takes a child a matter of minutes to circumvent them—I know from my experience—and I have wasted many hours fruitlessly trying to control these arrangements. That is just not good enough for any business. It is not good enough because so many teenagers have mobile phones, as we discussed—two-thirds of children have a smartphone by the age of 10. Moreover, it is not good enough because they are accessing huge amounts of filthy content, dodgy services and predatory adults, things that would never be allowed in the real world. The Office of the Children’s Commissioner for England revealed that one in 10 children had viewed pornography by the time they were nine years old. The impact on their lives is profound: just read the testimony on the recent Mumsnet forums about the awful impact of pornography on their children’s lives.

To prevent minors from accessing adult-only apps, the most efficient measure would be, as my noble friend Lady Harding pointed out, to check users’ ages during the distribution step, which means directly in the app store or on the web browser, prior to the app store or the internet browser initiating the app or the platform download. This can be done without the developer knowing the user’s specific age. Developing a reliable age-verification regime applied at that “distribution layer” of the internet supply chain would significantly advance the UK’s objective of creating a safer online experience and set a precedent that Governments around the world could follow. It would apply real-world principles to the internet.

This would not absolve any developer, app or platform of their responsibilities under existing legislation—not at all: it would build on that. Instead, it would simply mandate that every player in the ecosystem, right from the app store distribution layer, was legally obliged to promote a safer experience online. That is completely consistent with the principles and aims of the Online Safety Bill.

These amendments would subject two of the biggest tech corporations to the same duties regarding their app stores as we do the wider digital ecosystem and the real world. It is all about age assurance and protecting children. To the noble Lord, Lord Allan, I say that I cannot understand why my corner shop requires proof of age to buy cigarettes, pornography or booze, but Apple and Google think it is okay to sell apps with inappropriate content and services without proper age-verification measures and with systems that are wilfully unreliable.

There is a tremendous amount that is very good about Tim Cook’s commitment to privacy and his objections to the data industrial complex; but in this matter of the app stores, the big tech companies have had a blind spot to child safety for decades and a feeling of exceptionalism that is just no longer relevant. These amendments are an important step in requiring that app store owners step up to their responsibilities and that we apply the same standards to shopkeepers in the digital world as we would to shopkeepers in the real world.

Lord Storey Portrait Lord Storey (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I enter this Committee debate with great trepidation. I do not have the knowledge and expertise of many of your Lordships, to whom I have listened with great interest. What I do have is over 40 years’ experience of working with children, and experience as a parent myself. I want to make what are perhaps some innocent remarks.

I was glad that the right reverend Prelate the Bishop of Oxford raised the issue of online gaming. I should perhaps declare an interest, in that Liverpool is, I believe, the third-largest centre for the development of online games. It is interesting to note that over 40% of the entertainment industry’s global revenue comes from gaming, and that figure is steadily growing year on year.

If I am an innocent or struggle with some of these issues, imagine how parents must feel when they try to cope every single day. I suppose that the only support they currently have, other than their own common sense of course, is age ratings and parental controls. Even the age ratings confuse them, because there are different ratings for different situations. We know that films are rated by the British Board of Film Classification, which also rates Netflix and now Amazon. But it does not rate Disney+, which has its own ratings system.

We also know that the gaming industry has a different ratings system: the PEGI system, which links a number to an age. For example, PEGI 16, if a parent knew this, indicates that the depiction of violence or sexual activity in a game has reached a stage where it looks realistic. The PEGI system also uses pictorial descriptors to show this.

Thanks to the Video Recordings Act 1984, the PEGI 12, PEGI 16 and PEGI 18 ratings became legally enforceable in the UK, meaning that retailers cannot sell those video games to anyone below those ages; if a child or young person goes in, they cannot be sold those games. However, the Video Recordings Act does not currently apply to online games, meaning that children’s safety in online gaming relies primarily on parents setting up parental controls.

I will listen with great interest to the tussles between various learned Lords, as all these issues show me that perhaps the most important issue will come several Committee days down the path, when we talk about media literacy. That is because it is not just about enforcement, regulation or ratings; it is about making sure that parents have the understanding and the capacity. Let us not forget this about young people: noble Lords have talked about them all having a phone and wanting to go on pornographic sites, but I do not think that is the case at all. Often, young people, because of peer pressure and because of their innocence, are drawn into unwise situations. Then there are the risks to which gaming can lead: gaming addiction, for example, was mentioned by the right reverend Prelate the Bishop of Oxford. There is also the health impact and perhaps a link with violent behaviour, as well as the risks arising from the interactive nature of video games: cyberbullying and a diminished sense of well-being. All these things can happen, which is why we need media literacy to ensure that young people know of those risks and how to cope with them.

The other thing that we perhaps need to look at is standardising some of the simple gateposts that we currently have, hence the amendment.

--- Later in debate ---
It seems really important that we stick with the principle that if something is profoundly illegal in the offline world, we cannot allow it to be perpetrated in the online world. That compass needle has guided the thinking of a lot of us in trying to grapple with this issue, which is very complex for those of us who come to it new from outside the world of tech and the internet, but who have seen the results of some of the harms perpetrated. That is where the problem arises.
Lord Bethell Portrait Lord Bethell (Con)
- View Speech - Hansard - -

My Lords, I violently agree with my noble friend Lord Moylan that the grouping of this amendment is unfortunate. For that reason I am not going to plunge into the issue in huge detail, but there are a couple of things on which I would like to reassure my noble friend, and I have a question for the Minister.

The noble Baroness, Lady Kidron, said there is a package of amendments around age verification and that we will have a lot of time to dive into this, and I think that is probably the right format for doing it. However, I reassure my noble friend Lord Moylan that he is absolutely right. The idea is not in any way to shut off the town square from everyone simply because there might be something scary there.

Clause 11(3) refers to priority content, which the noble Lord will know is to do with child abuse and fraudulent and severely violent content. This is not just any old stuff; this is hardcore porn and the rest. As in the real world, that content should be behind an age-verification barrier. At the moment we have a situation on the internet where, because it has not been well managed for a generation, this content has found itself everywhere: on Twitter and Reddit, and in all sorts of places where it really should not be, because there are children there. We envisage a degree of tidying up of social media and the internet to make sure that the dangerous content is put behind age verification. What we are not seeking to do, and what would not be a benign or positive action, is to put the entire internet behind some kind of age-verification boundary. From that point of view, I completely agree with my noble friend.

Baroness Benjamin Portrait Baroness Benjamin (LD)
- View Speech - Hansard - - - Excerpts

My Lords, as might be expected, I will speak against Amendment 26 and will explain why.

The children’s charity Barnardo’s—here I declare an interest as vice-president—has said, as has been said several times before, that children are coming across pornographic content from as young as seven. Often they stumble across the content accidentally, unwittingly searching for terms such as “sex” or “porn”, without knowing what they mean. The impact that this is having on children is huge. It is harming their mental health and distorting their perception of healthy sexual relationships and consent. That will go with them into adulthood.

Age verification for pornography and age assurance to protect children from other harms are crucial to protect children from this content. In the offline world, children are rightly not allowed to buy pornographic DVDs in sex shops but online they can access this content at the click of a button. This is why I will be supporting the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, and am fully supportive of their age assurance and age verification schedule.

Lord Bethell Portrait Lord Bethell (Con)
- View Speech - Hansard - -

My Lords, if a child goes to the Windmill club, the most famous strip club in Soho, the bouncers will rightly turn them away, no ifs, no buts: no entry, full stop. If a child tries to buy a knife on Amazon or to place a bet on Bet365.com, it will be the same story: you need proof of age. But every day, millions of children in this country watch pornography in their homes, at schools, on the bus, on devices of all kinds, without any hindrance at all. The Children’s Commissioner makes it really clear that this is not just raunchy pornography like in the old days of Razzle magazine. These are depictions of degradation, sexual coercion, aggression and exploitation, disproportionately targeted at teenage girls. As Dame Rachel de Souza said:

“Most of it is just plain abuse”.


The effects of this failed experiment are absolutely disastrous. The British Board of Film Classification says that half of 11 year-olds have seen porn, and according to the NSPCC, a third of child abuse offences are now committed by children. The answer is straightforward in principle: we need to apply the rules on age verification for porn that exist in the real world to the online world. We need to address this harm immediately, before any more damage is done—before there is any metaverse or any more technology to spread it further.

I know that the Minister, the Secretary of State and the Prime Minister all broadly agree with this sentiment, and that is why the Bill has:

“A duty to ensure that children are not normally able to encounter content that is regulated provider pornographic content in relation to the service (for example, by using age verification).”


But this vague power simply starts a long process of negotiation with the porn industry and with tech. At a minimum, it will require a child protection consultation, a children’s access assessment, a guidance statement, an agreement on child protection guidance and codes, secondary legislation, parliamentary approval of the Ofcom child protection code, monitoring and engagement, engagement on the enforcement regime, test cases in the courts—and so on.

I appreciate that we are creating laws flexible enough to cope with technological evolution and I totally support that principle, but we should not reinvent the wheel. We tried that 30 years ago when the online porn industry started, and it failed. We need one regime for the real world and for the online world. This is an opportunity to send a message to the tech industries and to the British people that we mean business about protecting children, and to put Britain at the vanguard of child protection regulation.

I want to see this Bill on the statute book, and I am very grateful for engagement with the Minister, the Bill team and all those supporting the Bill. I look forward to suggestions on how we can close this gap. But if we cannot, I will table amendments that replace Part 5 of the Online Safety Bill with Part 3 of the Digital Economy Act 2017—a measure that has considerable support in another place.

Gambling Advertising

Lord Bethell Excerpts
Tuesday 1st March 2022

Lords Chamber
Lord Bethell Portrait Lord Bethell (Con)
- Hansard - -

My Lords, I am a champion of innovation, and I pay tribute to the gambling industry for the remarkable innovations it has made. I am not a big gambler—I enjoy the odd flutter—but I have seen a massive change in the level of entertainment that people get out of gambling. The industry has driven gambling to new audiences, and the way in which you can now gamble on sports is incredibly impressive. It uses advertising to reach audiences it has never reached before, and the image of the old bookie by the racecourse has been replaced by a high-tech company using the latest algorithms and behavioural techniques.

I must warn the industry, however, that with this immense power—the power of innovation, computers and psychological and behavioural science—comes responsibility. I am extremely concerned that it is in a state of denial about the impact of its innovation, particularly on the most vulnerable. We cannot continue to subsidise the industry for the £1.27 billion of harm that it is calculated to inflict on our society. We cannot have the NHS helping to look after tens of thousands or hundreds of thousands of patients with acute gambling addiction. Some 246,000 people are estimated to suffer from severe gambling-related harm. That is too heavy a load for our society to be carrying.

I ask the industry and the Minister to consider measures to protect two groups in particular. My noble friend Lady Chisholm talked about children, an area that concerns me in particular. As a father of four small children, I know how much access they have to digital communications, and, with a strong interest in sport, they are very easily lured into gambling of all kinds. Digital companies, which is what gambling companies have become, owe it to themselves and to society to make sure that our children are protected.

Secondly, the gambling industry has shown a long-standing, generational lack of responsibility to those with severe mental illness when it comes to gambling. Time and again, casinos, and now the digital companies, have not stepped up to their responsibility to cut off those who cannot afford their addiction. They should be using innovation and their digital insight to make sure that those who cannot afford to gamble are not allowed to gamble. The fines given to 888, which were in the papers this morning, are disgraceful. An NHS worker who was paid £1,400 a month was given a gambling cap of £1,300. That is not reasonable.

The industry owes it to itself to step up to these responsibilities, and I urge the Minister to look at ways in which the Government can help it do that.

Gambling: Children and Young People

Lord Bethell Excerpts
Monday 6th December 2021

Lords Chamber
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My Lords, the Gambling Industry Code for Socially Responsible Advertising requires that paid-for social media adverts be targeted only at people aged 25 and above, and that YouTube content produced by an operator’s own YouTube channels be restricted to accounts verified as being 18 and above. However, all this will be looked at as part of the Gambling Act review.

Lord Bethell Portrait Lord Bethell (Con)
- Hansard - -

My Lords, Twitter says it would never knowingly market to minors, yet our experience and the report make it clear that this just does not work. Some people want to see these adverts, but I come back to the question of opt-ins and ask the Minister whether he will commit to an opt-in protocol for gambling advertising.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My noble friend tempts me to pre-empt the work of the Gambling Act review, which is ongoing. It is certainly looking at issues such as that.

Cairncross Review

Lord Bethell Excerpts
Thursday 6th February 2020

Lords Chamber
Baroness Kidron Portrait Baroness Kidron
- Hansard - - - Excerpts

To ask Her Majesty’s Government what steps they are taking in response to The Cairncross Review: a sustainable future for journalism.

Lord Bethell Portrait Lord Bethell (Con)
- Hansard - -

My Lords, I gently remind the House of the three-minute time limit. This is a time-limited debate, and it would be helpful if Members could please stick to that limit.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, it has been a year since Dame Frances Cairncross published her review, A Sustainable Future for Journalism. Cairncross’s remit was

“to consider the sustainability of the production and distribution of high-quality journalism, and especially the future of the press”.

The review’s six chapters outline: the importance of high-quality journalism to democracy; the rapidly changing market; the plummeting revenues of publishers; the huge power of the online platforms; and the need to protect public interest news. Sadly, the Government’s response does not comprehensively answer Dame Frances’s nine recommendations, nor does it fully address the two intrinsically linked systemic points that she highlights—notably, the impact of platforms as mediators on the quality of the news and the asymmetry of power between platforms and publishers when it comes to revenue.

I declare my interests as set out in the register, particularly as a member of the House of Lords’ digital democracy inquiry committee and as chair of the 5Rights Foundation.

The most urgent issue raised repeatedly by Cairncross is how new distribution models for high-quality journalism have eroded revenue. This is a sector being hollowed out before our eyes, with reduced resources to hold institutions to account, as the platform model drives down quality in pursuit of profit. In her introduction, Cairncross points out:

“People read more sources of news online, but spend less time reading it than they did in print. They increasingly skim, scroll or passively absorb news, much of it ‘pushed’ news”,

which is

“based on data analytics and algorithms, the operation of which are often opaque.”

Platforms such as Facebook, Twitter, Google and YouTube measure views, likes and retweets, not the quality of the news they share. Under the guise of being “user first”, they are focused on building algorithms to increase engagement and, with it, their revenues—not on people’s understanding of what is happening in the world around them.

The user journey that offered a diet of financial, entertainment, political and international news as readers made their way from front page to sports page has been replaced by unbundled news: bite-sized snacks driven by an opaque list of inputs that optimise user engagement, so that it is often difficult for readers to know or recall the source. Disaggregated news driven by commercial concerns necessarily interferes with a user journey based on editorial or public interest values. This business model enables disinformation to masquerade as news. It is not without consequences: the victims are children who get measles, pensioners who give up their savings and individuals who vote on false promises.

Cairncross recommended:

“New codes of conduct to rebalance the relationship between publishers and online platforms”,


underpinned by a news quality obligation under regulatory oversight. While the government response has warm words about these codes, it is unclear on whether they are to be put on a statutory footing, silent on who will have oversight, and offers no timetable. The news quality obligation becomes a vague sense that platforms must

“help users identify the reliability and trustworthiness of news sources”,

with allusions to the online harms White Paper. I do not understand why the Government commissioned a review on such an urgent matter, only for us to wait a year to hear that we will wait several more. Can the Minister outline the steps the Government will take to introduce new, effective codes of conduct and when we will begin to see them enforced? Also, what obstacles does she see to introducing a news quality obligation in response to the review, rather than waiting for an online harms Bill whose effect may not be felt for another couple of years?

As classified and display ads have moved wholesale from publishers to platforms, particularly Google, where targeted advertising is king, the duopoly of Google and Facebook has become eye-wateringly rich and the news sector increasingly poor. Meanwhile, news producers remain at the mercy of news feed algorithms that can, at the whim of a platform, be changed for no transparent reason, giving platforms the power literally to bury the news. Cairncross’s observation that the opaque advertising supply chain is weighted against content creators is not new. It was central to the Communications Committee’s report, UK Advertising in a Digital Age; it has been the subject of much complaint by advertisers themselves; and it is well laid out in the interim review from the CMA.

This dysfunctional business model hits the local press the hardest. The Yorkshire Evening Post showed its societal value by having local reporters when it broke the story of a child being treated on an NHS hospital floor. The subsequent false discrediting of the story on social media showed the financial value in misinformation. The editor’s plea to the digital democracy committee was that the Post needed a fairer share of the value of the content it produces. Without it, it simply cannot continue to put reporters on the front line.

Cairncross recommends an innovation fund, VAT exemption to match offline publishing and allowing local papers charitable status. The first of these is being done by NESTA, the second is being looked at by the Treasury, and the last the Government rejected outright, but at the heart of her recommendations was that the CMA should use its powers to investigate the advertising supply chain to ensure that the market be fair and transparent. Given the unanimity of this view, and the disproportionate control of the platforms, will the Minister tell the House whether she would like to see—as many of us would—the CMA move to a full market investigation to clean up the advertising supply chain?

Cairncross urged the extension of the Local Democracy Reporting Service, but this has been interpreted by the Government as an extension of the BBC local news partnerships, with no additional funding. This is not an adequate response to the crisis in local journalism, nor does it fulfil the Government’s own promise to advocate for voters outside the metropole, whose local interests may be too small to be of financial value in the attention economy of the multinationals. Leaving whole parts of the country out of sight is not sustainable for our democracy.

The review also called for an Ofcom inquiry into the impact of BBC News on the commercial sector. However, I would argue that of greater concern are the recent announcements of large-scale cuts to BBC News. Amid the crisis in the local press, it is simply not the right time to undermine the BBC. In an era of catastrophically low trust, BBC News is uniquely trusted by 79% of the population—a statistic that any platform or politician would beg for.

Finally, the commitment from the Government to support media literacy is hugely welcome. The ability to identify the trustworthiness of a source, and to understand the platforms’ algorithms, how they affect what you see and who benefits from your interactions, is vital. But I urge the noble Baroness to make clear in her answer that media literacy is no substitute for cleaning up the hostile environment in which the news now sits.

I asked Frances Cairncross to comment on the government response to her review. She said it was

“of particular regret that the government rejected out of hand the idea of an Institute of public interest journalism.”

On another occasion, one might underline further the responsibility of the press to uphold their own editorial standards and better fulfil their public interest role but, for today, I wish to congratulate Dame Frances on categorically making the case for high-quality journalism as a crucial safeguard for democracy.

I look forward to hearing from many knowledgeable colleagues and thank them in advance for their contributions. Since The Cairncross Review was published, the news sector has become more fragile, while the platforms’ power has become entrenched. I hope that the Minister—delightfully making her maiden speech in this debate—finds a way of reassuring the House that the Government intend to tackle the systemic issues that Cairncross has identified with the seriousness and urgency they require. I beg to move.