38 Baroness Benjamin debates involving the Department for Digital, Culture, Media & Sport

Wed 28th Feb 2024
Wed 6th Sep 2023
Wed 19th Jul 2023
Mon 10th Jul 2023: Online Safety Bill, Lords Chamber (Report stage: Part 1)
Thu 25th May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Tue 23rd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Tue 16th May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 2nd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 27th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 25th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)

Baroness Benjamin (LD)

My Lords, I welcome this long-awaited Media Bill and declare an interest as per the register. The children’s television sector is in crisis. Ofcom has identified a dramatic shift in viewing habits among young people, particularly those over the age of seven. This, together with the long-term reduction in commissioning of original UK content for children, has led to a situation in which children and young people are essentially lost to public service broadcasting.

However, this Media Bill does not address these issues. The Media Bill should focus on the spaces where children are watching media now, not the spaces they have deserted. We need to ensure that our children can find public service content in places where they are now spending their screen time, which is on non-child-friendly, unregulated platforms. This crisis largely affects school-age children, where the migration of the audience to online services has reached alarming levels. High-quality, pre-school content is relatively robust, because parents control viewing on on-demand services, such as CBeebies and Channel 5’s “Milkshake!”

For older children, the problem of audience loss becomes acute. Live-action content that reflects the lives and concerns of British children is the hardest hit and is at risk of disappearing from commercial PSBs altogether. The BBC, which is the biggest provider of UK content for school-age children, has decided to focus more resources on animation to win back young viewers lost to streamers and video-sharing sites. The Children’s Media Foundation’s recent consultation revealed that the kids’ audience is no longer finding relevant, targeted, UK public service content, so is flocking to services such as YouTube and TikTok and watching adult content. Alarmingly, the consultation also showed that this fundamental shift in viewing is likely to be a contributing factor to the post-pandemic crisis of childhood, with severe implications for the personal well-being of a generation of young people. The lack of relevance or connection in the content contributes to a sense of isolation and increases levels of anxiety and mental health challenges.

Over the past 75 years, high-quality UK content for children has been a huge British success story and the envy of the world, but, over the past 20 years, consumption by children of traditional, regulated PSB content has been in freefall. This is partly due to the explosion of choice children have in their hands via new devices and new platforms and to the 2007 ban on advertising HFSS food to children, which saw commercial PSB investment in children’s content decline by 40% over the following decade.

Children have deserted PSB kids’ TV because they easily can, because of the affordability of technology and, crucially, because they have control of their own devices. Added to the mix is the huge rise of unregulated advertising and subscription video-on-demand platforms such as YouTube or TikTok, as well as Netflix and Disney+, where children are watching content aimed at international audiences and dominated by US content. Who do we want to be role models for our children: influencers and extremists on social media or the diverse and inclusive performers and characters on public service children’s television?

What is the answer? One was the powerful and relatively low-cost intervention of the Government’s three-year pilot of the young audiences content fund. This successful fund, which has now ended, supported the creation of quality, distinctive content for audiences up to the age of 18 on public service broadcasters and their online platforms. When I secured more powers for Ofcom in relation to children’s TV programmes in the Digital Economy Act 2017, I was pleased that ITV increased its investment in partnership with this fund. Sadly, that content got relatively small audiences on CITV and ITV Hub because revenue was very limited, partly due to restrictions on the products that can be advertised to children. Ultimately, these pressures led to the recent closure of CITV.

In contrast, YouTube alone takes in around £50 million a year in advertising revenue from unregulated children’s content. It is very difficult for public service broadcasters to invest in kids’ TV content. They do not have the scale of kids’ audience, or even a fraction of the revenue from kids’ content, that they once had. This can be addressed only by public funding in one form or another without top-slicing the licence fee, perhaps with enhanced tax incentives, a levy on streamers and online services, Lottery funding or public funding from appropriate sources.

We need regulation to ensure prominence for content rather than services on video-sharing platforms, so the Bill should empower Ofcom to consider the extension of prominence regulation not only to PSB services on streaming on-demand platforms or smart TVs but to video-sharing platforms, using algorithms and recommendation systems. Perhaps the Government should consider a public service algorithm to give prominence to certified regulated content. I will be interested to hear the Minister’s views on this idea.

Let us take this golden opportunity to make the Media Bill more future-focused on our children’s media reality by reflecting what young people are already doing, which needs support and regulation, with a public service system fit for the 21st century. Once the last children have totally abandoned regulated broadcast television for an unregulated media landscape full of content with little relevance to their lives, a vital part of the fabric that contributes to our quality of life in the UK will be irretrievably lost.

Online Safety Bill

Baroness Benjamin Excerpts
Baroness Benjamin (LD)

My Lords, I rise very briefly to thank the Minister for getting us to where we are today—the content of a Bill that I have advocated for over a decade. I thank the noble Baroness, Lady Kidron, for her kind words. She is my heroine.

I am so happy today to discuss the final stages of this Bill. The Minister has shown true commitment, tenacity and resilience, even through the holiday period. He has listened to the voices of noble Lords from across the House and to parents, charities and schools, and he has acted in the best interests of the future of society’s well-being. To him I say thank you. I fully support what he has to say today about the measures that he has put down to safeguard children and to prevent the worst type of child sexual abuse and exploitation imaginable, which, according to the IWF, has doubled in the last two years.

I am pleased that the Government have not been blown off course by those who feel that privacy is more important than child protection. I hope that Clause 122 of the Bill in relation to the use of technology notices remains unchanged in the final stages of deliberation. It will be good to have that confirmation once again today from the Minister.

On behalf of the IWF, CEASE and Barnardo’s—I declare an interest as a vice-president—we are so grateful to the Minister for the diligence, hard work and dedication to duty that he has shown. I very much look forward to continuing to work closely with him, and with noble Lords from all sides of the House, to ensure that the implementation of the amendments we have all worked so hard to secure happens.

I look ahead to the review into pornography, which is often the gateway to other harms. I also look forward to working to make the UK the safest place in the world—the world is looking at us—to go online for everyone in our society, especially our children. As I always say, childhood lasts a lifetime. What a legacy we will leave for them by creating this Bill. I thank the Minister for everything that he has done—my “Play School” baby.

Online Safety Bill

Baroness Benjamin Excerpts
Although we have reason to hope that Ofcom will act more swiftly under the Online Safety Bill, we are trying to judge this on the basis of previous experience. There is disappointment at times across the House at the slow progress in enforcing the video-sharing platform regime. It is nearly three years since that regime was introduced but we have still not seen the outcome of a single investigation against a platform. Greater communication and clarity throughout the process would go a huge way towards rebuilding that trust. I look forward to the Minister’s response, and I seek the assurances that lie at the heart of the amendment. On that basis, I commend the amendment to the House.
Baroness Benjamin (LD)

My Lords, I want to say “Hallelujah”. With this Bill, we have reached a landmark moment after the disappointments and obstacles that we have had over the last six years. It has been a marathon but we are now in the final straight with the finishing line in sight, after the extraordinary efforts by noble Lords on all sides of the House. I thank the Secretary of State for her commitment to this ground-breaking Bill, and the Minister and his officials for the effort they have put into it. The Minister is one of my “Play School” babies, who has done his utmost to make a difference in changing the online world. That makes me very happy.

We know that the eyes of the world are watching us because legislators around the world are looking for ways to extend the rule of law into the online world, which has become the Wild West of the 21st century, so it is critical that in our haste to reach the finishing post we do not neglect the question of enforcement. That is why I have put my name to Amendment 268C in the name of the noble Lord, Lord Weir: without ensuring that Ofcom is given effective powers for this task of unprecedented scale, the Bill we are passing may yet become a paper tiger.

The impact assessment for the Bill estimated that 25,000 websites would be in scope. Only last week, in an encouraging report by the National Audit Office on Ofcom’s readiness, we learned that the regulator’s own research has increased that estimate to 100,000, and the figure could be significantly higher. The report went on to point out that the great majority of those websites will be based overseas and will not have been regulated by Ofcom before.

The noble Lord, Lord Bethell, raised his concerns on the final day of Committee, seeking to amend the Bill to make it clear that Ofcom could take a schedule of a thousand sites to court and get them all blocked in one go. I was reassured when the Minister repeated the undertaking given by his counterpart in Committee in the other place that the Civil Procedure Rules already allow such multiparty claims. Will the Minister clarify once again that such enforcement at scale is possible and would not expose Ofcom to judicial review? That would give me peace of mind.

The question that remains for many is whether Ofcom will act promptly enough when children are at risk. I am being cautious because my experience in this area with regulators has led me not to assume that simply because this Parliament passes a law, it will be implemented. We all know the sorry tale of Part 3 of the Digital Economy Act, when Ministers took it upon themselves not to decide when it should come into force, but to ask whether it should at all. When they announced that that should be never, the High Court took a dim view and allowed judicial review to proceed. Interestingly, the repeal of Part 3 and the clauses that replaced it may not have featured in this Bill were it not for that case—I always say that everything happens for a reason. The amendment is a reminder to Ofcom that Parliament expects it to act, and to do so from the day when the law comes into force, not after a year’s grace period, six months or more of monitoring or a similar period of supervision before it contemplates any form of enforcement.

Many of the sites we are dealing with will not comply because this is the law; they will do so only when the business case makes compliance cheaper than the consequences of non-compliance, so this amendment is a gentle but necessary provision. If for any reason Ofcom does not think that exposing a significant number of children in this country to suicide, health harm, eating disorder or pornographic content—which is a universal plague—merits action, it will need to write a letter to the Secretary of State explaining why.

We have come too far to risk the Bill not being implemented in the most robust way, so I hope my noble friends will join me in supporting this belt-and-braces amendment. I look forward to the Minister’s response.

Baroness Merron (Lab)

My Lords, we welcome the government amendments in this group to bring child sexual exploitation and abuse failures into the scope of the senior manager liability and enforcement regime but consider that they do not go far enough. On the government amendments, I have a question for the Minister about whether, through Clause 122, it would be possible to require a company that was subject to action to do some media literacy as part of its harm reduction; in other words, would it be possible for Ofcom to use its media literacy powers as part of the enforcement process? I offer that as a helpful suggestion.

We share the concerns expressed previously by the noble Lord, Lord Bethell, about the scope of the senior manager liability regime, which does not cover all the child safety duties in the Bill. We consider that Amendment 268, in the name of my noble friend Lord Stevenson, would provide greater flexibility, giving the possibility of expanding the list of duties covered in the future. I have a couple of brief questions to add to my first question. Will the Minister comment on how the operation of the senior manager liability regime will be kept under review? This has, of course, been something of a contentious issue in the other place, so could the Minister perhaps tell your Lordships’ House how confident he is that the current position is supported there? I look forward to hearing from the Minister.

--- Later in debate ---
Baroness Morgan of Cotes (Con)

My Lords, the hour is late and I will not detain the House for long. However, I hope that the fact that we are all still sitting here at the end of a long Report stage, because we care very much about the Bill and what we are trying to achieve, will be noted by my noble friend the Minister, his officials and others who are watching. I thank my noble friend Lady Harding for so ably introducing the amendments, which I absolutely support. I was, perhaps for the first time, going to agree with something the noble Baroness, Lady Fox, said a day or so ago: that one thing we and Ofcom need to do much better is to understand the transparency of the algorithms. It is not just algorithms—this is where my knowledge ends—but other design features that make these sites addictive and harmful, and which are outside content. The Bill will not be capable of addressing even the next five years, let alone beyond that, if we do not reflect the fact that, as my noble friend Lady Harding said, it has already been amended so that one way its objectives are to be achieved is by services being required to focus on safety by design.

I hope very much that my noble friend will take up the invitation, because everybody is tired and has been looking at this Bill for so many hours and months that we are probably all word-blind. We could all do with standing back and thinking, “With the amendments made, how does it all hang together so that ultimately, we keep those we want to keep safe as safe as we possibly can?” On that basis, I support these amendments and look forward to hearing further from the Government about how they hope to keep safe those we all wish to keep safe.

Baroness Benjamin (LD)

My Lords, I rise to support the amendment in the name of the noble Baroness, Lady Kidron. She has been such a forceful voice throughout the passage of this Bill, driven by her passion to protect children, and no more so than with the amendment in her name. That is why I feel compelled to speak up to support her. So far, we have all worked with the Government to see the safe passage of the Online Safety Bill, with strong protections for children. These amendments would be yet another excellent and unique opportunity to protect children. This is what we have been fighting for for years, and it is so uplifting that the Government have listened to us throughout the passage of this Bill—so why stop now? If the Government are saying that the Bill is being clear about harms, they should have no objection to making it explicit.

These amendments press for safety by design to be embedded in later clauses of the Bill and go hand in hand with the earlier amendment that the House so clearly supported. It is clear that the design of services and algorithms is responsible for orchestrating and manipulating the behaviour, feelings, emotions and thoughts of children who, because they are at a vulnerable stage in their development, are easily influenced. We have all witnessed the disastrous impact of the new technology which is fast encroaching upon us, and our children will not be spared from it. So it is imperative that Ofcom have the tools with which to consider and interrogate system design separately from content because, as has been said, it is not only content that is harmful: design is too. We therefore need to take a holistic approach and leave nowhere to hide for the tech companies when it comes to harms affecting our children.

As I have said before, these amendments would send a loud and clear message to the industry that it is responsible for the design of its products and has to think of the consequences for our children’s mental health and well-being when considering design. What better way to do that than for the Government to accept these amendments, in order to show that they are on the side of our children, not the global tech companies, when it comes to protecting them from harm? They need to put measures in place to ensure that the way a service is designed is subject to the online safety regime we have all fought for over the years and during the passage of this Bill.

If the Government do not accept the amendment, perhaps the issue of harmful design could be included in the welcome proposed review of pornography. It would be good to hear the Minister’s thoughts on this idea—but I am not giving him a let-off. I hope he will listen to the strength of feeling and that the Government will reconsider their position, support the amendment and complete the one main task they set out to complete with this Bill, which is to protect children from harm no matter where it rears its ugly head online.

Baroness Fraser of Craigmaddie (Con)

My Lords, I rise briefly to support my noble friend Lady Harding and to associate myself with everything she has just said. It strikes me that if we do not acknowledge that there is harm from functionality, not just content, we are not looking to the future, because addressing functionality protects vulnerable people before the harm has happened, whereas content relies on us having to take it down afterwards. I want to stress that algorithms and functionality disproportionately harm not just vulnerable children but vulnerable adults as well. I do not understand why, since we agreed to safety by design at the beginning of the Bill, it is not running throughout it, rather than just in the introduction. I want to lend my support to these amendments this evening.

Online Safety Bill

Baroness Benjamin Excerpts
Lord Russell of Liverpool (CB)

My Lords, I will speak, in part, to two amendments with my name on them and which my noble friend Lady Kidron referred to: Amendments 46 and 90 on the importance of dissemination and not just content.

A more effective way of me saying the same thing differently is to personalise it by trying to give your Lordships an understanding of the experience taking place, day in, day out, for many young people. I address this not only to the Minister and the Bill team but, quite deliberately, to the Office of the Parliamentary Counsel. I know full well that the Bill has been many years in gestation and, because the online world, technology and now AI are moving so fast, it is almost impossible for the Bill and its architecture to keep pace with them. But that is not a good reason for not listening to and accepting the force of the argument which my noble friend Lady Kidron and many others have put forward.

Last week, on the first day on Report, when we were speaking to a group of amendments, I spoke to your Lordships about a particular functionality called dark patterns, which are a variety of different features built into the design of these platforms to drive more and more volume and usage.

The individual whose journey I will be describing is called Milly. Milly is online and she accepts an automatic suggestion that is on a search bar. Let us say it is about weight loss. She starts to watch videos that she would not otherwise have found. The videos she is watching are on something called infinite scroll, so one just follows another that follows another, potentially ad infinitum. To start off, she is seeing video after video of people sharing tips about dieting and showing how happy they are after losing weight. As she scrolls and interacts, the women she sees mysteriously seem to get thinner and thinner. The platform’s content dispersal strategy—if indeed it has one, because not all do—that tempers the power of the algorithm has not yet kicked in. The Bill does not address this because, individually, not a single one of the videos Milly has been watching violates the definition of primary priority content. Coding an algorithm to meet a child’s desire to view increasingly thin women is what they are doing.

The videos that Milly sees are captioned with a variety of hashtags such as #thinspo, #thighgap and #extremeweightloss. If she clicks on those, she will find more extreme videos and will start to click on the accounts that have posted the content. Suddenly, she is exposed to the lives of people who are presenting disordered eating not just as normal but as aspirational. Developmentally, Milly is at an age where she does not have the critical thinking skills to evaluate what she is seeing. She has entered a world that she is too young to understand and would never have found were it not for the design of the platform. Throughout her journey thus far, she has yet to see a single video that meets the threshold of primary priority harm content. This world is the result of cumulative design harms.

She follows some of the accounts, which prompts the platform to recommend similar accounts. Many of the accounts recommended to her are even more extreme. They are managed by people who have active eating disorders but see what is known as their pro-ana status—that is, pro-anorexia—as a lifestyle choice rather than a mental health issue. These accounts are very savvy about the platform’s community guidelines, so the videos and the language they use are coded specifically to avoid detection.

Every aspect of the way Milly is interacting with the platform has now been polluted. It is not just the videos she sees. It is the autocomplete suggestions she gets on searches. It is the algorithmically determined account recommendations. It is the design strategies that make it impossible for her to stop scrolling. It is the notifications she receives encouraging her back to the platform to watch yet another weight-loss video or follow yet another account. It is the filters and effects she is offered before she posts. It is the number of likes her videos get. It goes on and on, and the Bill as it is stands will fail Milly. This is why I am talking directly to the Minister and the Office of the Parliamentary Counsel, because they need to sort this out.

Earlier on this afternoon, before we began this debate, I was talking to an associate professor in digital humanities at UCL, Dr Kaitlyn Regehr. We were talking about incels—involuntary celibates—and the strange world they live in, and she made a comment. This is a quote that I wrote down word for word because it struck me. She said:

“One off-day seeds the algorithm. The algorithm will focus on that and amplify that one off-day”—


that one moment when we click on something and suddenly it takes us into a world and in a direction that we had no idea existed but, more importantly, because of the way these are designed, we feel we have no control over. We really must do something about this.

Baroness Benjamin (LD)

My Lords, I rise to support the amendments in the names of the intrepid noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford. They fit hand in hand with the amendments that have just been debated in the previous group. Sadly, I was unable to take part in that debate because of a technical ruling, but I thank the Minister for his kind words and thank other noble Lords for what they have said. But my heart is broken, because those amendments included age verification, for which I have campaigned for the past 12 years, and I wanted to thank the Government for finally accepting that children need to be protected from online harmful content, pornography being one example; it is the gateway to many other harms.

Lord Russell of Liverpool (CB)

My Lords, I will speak very briefly. I could disagree with much of what the noble Baroness just said, but I do not need to go there.

What particularly resonates with me today is that, since I first entered your Lordships’ House at the tender age of 28 in 1981, this is the first time I can ever remember us having to rein back what we are discussing because of the presence of young people in the Public Gallery. I reflect on that, because it brings home the gravity of what we are talking about and its prevalence; we cannot run away or hide from it.

I will ask the Minister about the International Regulatory Cooperation for a Global Britain: Government Response to the OECD Review of International Regulatory Cooperation of the UK, published 2 September 2020. He will not thank me for that, because I am sure that he is already familiar and word-perfect with this particular document, which was pulled together by his noble friend, the noble Lord, Lord Callanan. I raise this because, to think that we can in any way, shape or form, with this piece of legislation, stem the tide of what is happening in the online world—which is happening internationally on a global basis and at a global level—by trying to create regulatory and legal borders around our benighted island, is just for the fairies. It is not going to happen.

Can the Minister tell us about the degree to which, at an international level, we are proactively talking to, and learning from, other regulators in different jurisdictions, which are battling exactly the same things that we are? To concentrate the Minister’s mind, I will point out what the noble Lord, Lord Callanan, committed the Government to doing nearly three years ago. First, in relation to international regulatory co-operation, the Government committed to

“developing a whole-of-government IRC strategy, which sets out the policies, tools and respective roles of different departments and regulators in facilitating this; … developing specific tools and guidance to policy makers and regulators on how to conduct IRC; and … establishing networks to convene international policy professionals from across government and regulators to share experience and best practice on IRC”.

I am sure that, between now and when he responds, he will be given a detailed answer by the Bill team, so that he can tell us exactly where the Government, his department and Ofcom are in carrying out the commitments of the noble Lord, Lord Callanan.

Baroness Benjamin (LD)

My Lords, although I arrived a little late, I will say, very briefly, that I support the amendments wholeheartedly. I support them because I see this as a child protection issue. Viewing AI-generated material will, I believe, lead people to go out and find real children to sexually abuse. I will not take up any more time, but I wholeheartedly agree with everything that has been said, apart from what the noble Baroness, Lady Fox, said. I hope that the Minister will look very seriously at the amendments and take them into consideration.

Online Safety Bill

Baroness Benjamin Excerpts
The measures in this group are not meant to stifle innovation or to hold back the industry—quite the opposite. My noble friend Lady Harding alluded to the Industrial Revolution; taking children out of the pits led to a great investment in, and the growth of, the coal mining industry. Setting clear tracks for progress and putting in place humane provisions create the conditions under which industries can flourish. I fear that, if we do not get this one right, we will be tripping over ourselves; the pornographic industry will become grit in the gears of industry for years to come. By being clearer and more emphatic in these measures, the Bill can be an agent for innovation and encourage a great flourishing of these very important industries.
Baroness Benjamin (LD)

My Lords, I support everything that was said by the intrepid noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. I will speak to Amendment 185, which is in my name and is supported by the noble Lord, Lord Farmer. My amendment seeks to bring the regime for online pornography content in line with what exists offline.

The Video Recordings Act 1984 makes it a criminal offence to have prohibited content offline or to supply any unclassified work. Under this regulation, the BBFC will not classify any pornographic content that is illegal or material that is potentially harmful. That includes material that depicts or promotes child sex abuse, incest, trafficking, torture and harmful sexual acts. This content would not be considered R18, and so would be prohibited for DVD and Blu-ray. This also applies, under the Communications Act 2003, to a wide range of services that are regulated by Ofcom, from large providers such as ITVX or Disney+, to smaller providers including those that produce or provide pornographic content.

However, in the wild west of the online world, there is no equivalent regulation. Online pornography so far has been left to evolve without boundaries or limitations. Devastatingly, this has had a disastrous impact on child protection. Content that would be prohibited offline is widely available on mainstream pornographic websites. This includes material that promotes violent sexual activity, including strangulation; pornography that depicts incest, including that between father and daughters or brothers and sisters; and content that depicts sexual activity with adult actors made to look like children. This content uses petite, young-looking adult performers, who are made to look underage through props such as stuffed toys, lollipops and children’s clothing. This blurring of the depiction of sexual activity with adult actors who are pretending to be underage makes it so much harder to spot illegal child sex abuse material.

According to research by Dr Vera-Gray and Professor McGlynn, incest pornography is rife. Online, all of this can be accessed at the click of a button; offline, it would not be sold in sex shops. Surely this Bill should bring an end to such disparities. This content is extremely harmful: promoting violence against girls and women, sexualising children and driving the demand for real child sex abuse material, which of course is illegal.

Depictions of sexual activity with the title “teen” are particularly violent. A study analysing the content of the three most accessed pornographic websites in the UK found that the three most common words in videos containing exploitation were “schoolgirl”, “girl” and “teen”. It is clear that underage sexual activity is implied. How have we as a society arrived at a point where one of the most commonly consumed pornographic genres is sexual violence directed at children?

Our security services can confirm this too. Retired Chief Constable Simon Bailey, the former child protection lead at the National Police Chiefs’ Council, told the Independent Inquiry into Child Sexual Abuse that the availability of pornography was

“creating a group of men who will look at pornography”

so much that they reach

“the point where they are simply getting no sexual stimulation from it … so the next click is child abuse imagery”.

We know that the way pornography affects the brain means that users need more and more extreme content to fulfil themselves. It is like a drug. Pornography sites know this and exploit it. They design their sites to keep users for as long as possible, so as to increase exposure to adverts and therefore their revenue. They do this by presenting a user with ever-more extreme content. In 2021, Dr Vera-Gray and Professor McGlynn found that one in every eight titles advertised to a new user described acts of sexual violence.

I recently hosted a screening of the harrowing documentary “Barely Legal” here in the House of Lords. The documentary demonstrated just how far the pornography industry will go to make a profit, using extremely young-looking adult actors in content that suggests sexual activity with underage girls. Believe it or not, the pornography industry is worth much more than Hollywood; it makes thousands and thousands of dollars per second. Its quest for money comes at the expense of child protection and of society as a whole. This cannot be allowed to continue without regulation. Enough is enough.

Interviews with offenders who view illegal child sex abuse material in the UK indicate that most had not intentionally sought out child sex abuse materials. Nine out of 10 offenders said that they first encountered child sex abuse material through online pop-ups and linked material while looking at pornography sites.

I visited Rye Hill prison in Rugby, which houses over 600 sex offenders. Many said that they were affected by viewing porn, with devastating, life-changing outcomes. The largest ever survey of offenders who watch child sex abuse material online found significant evidence that those who watch illegal material are at high risk of going on to contact or abuse a child directly. Almost half said that they sought direct contact with children through online platforms after viewing child sexual abuse material.

This is an urgent and immediate child protection issue affecting our children. These concerns were shared earlier this year by the Children’s Commissioner for England, whose research found that 79% of children had encountered violent pornography

“depicting … degrading or pain-inducing sex acts”

before they reached the age of 18. The impact that this is having on our children is immeasurable.

Online Safety Bill

Baroness Benjamin Excerpts
Baroness Healy of Primrose Hill (Lab)

My Lords, I have put my name to Amendment 220E, in order that the Internet Watch Foundation is duly recognised for its work and there is clarity about its role in the future regulatory landscape. So far, no role has been agreed with Ofcom. This could have a detrimental effect on the vital work of the IWF in combating the proliferation of child sexual abuse images and videos online.

As other noble Lords have said, the work of the IWF in taking down the vile web pages depicting sexual abuse of children is vital to stemming this tide of abuse on the internet. Worryingly, self-generated images of children are on the rise, and now account for two-thirds of the content that is removed by the IWF. Seven to 10-year-olds are now the fastest-growing age group appearing in these images. As the noble Baroness, Lady Morgan, said, girls appear in 96% of the imagery the IWF removes from the internet—up almost 30 percentage points from a decade ago. The abuse of boys is also on the rise. In the past year the IWF has seen a 138% increase in images involving them, often linked to sexual extortion.

This amendment attempts to clarify the future role of the IWF, so we await the response from the Government with interest. Tackling this growing plague of child sexual abuse is going to take all the expert knowledge that can be found, and Ofcom would be strengthened in its work by formally co-operating with the IWF.

Briefly, I also support Amendment 226, in the name of my noble friend Lord Knight, to require Ofcom to establish an advocacy body for children. I raised this at Second Reading, as I believe that children must be represented not just by the Children's Commissioner, welcome though that is, but by a body that actively includes them, not just speaks for them. The role of the English Children’s Commissioner as a statutory consultee is not an alternative to advocacy. The commissioner’s role is narrowly focused on inputting into the codes of practice at the start of the regulatory cycle, not as an ongoing provider of children’s experiences online.

This body would need to be UK-wide, with dedicated staff to consistently listen to children through research projects and helplines. It will be able to monitor new harms and rapidly identify emerging risks through its direct continual contact with children. This body would assist Ofcom and strengthen its ability to keep up with new technology. The new body will be able to share insights with the regulator to ensure that decisions are based on a live understanding of children’s safety online and to act as an early warning system. Establishing such a body would increase trust in Ofcom’s ability to stay in touch with those it needs to serve and be recognised by the tech companies as a voice for children.

There must be a mechanism that ensures children’s interests and safety online are promoted and protected. Children have a right to participate fully in the digital world and have their voices heard, so that tech companies can design services that allow them to participate in an age-appropriate way to access education, friendships and entertainment in a safe environment, as the Bill intends. One in three internet users is a child; their rights cannot be ignored.

Baroness Benjamin (LD)

My Lords, I support Amendment 220E in the names of my noble friend Lord Clement-Jones and the noble Baroness, Lady Morgan of Cotes. I also support the amendments in the name of the noble Baroness, Lady Kidron, and Amendment 226, which deals with children’s mental health.

I have spoken on numerous occasions in this place about the devastating impact child sexual abuse has and how it robs children of their childhoods. I am sure everyone here will agree that every child has the right to a childhood free of sexual exploitation and abuse. That is why I am so passionate about protecting children from some of the most shocking and obscene harm you can imagine. In the case of this amendment and child sexual abuse, we are specifically talking about crimes against children.

The Internet Watch Foundation is an organisation I am proud to support as one of its parliamentary champions, because its staff are guardian angels who work tirelessly beyond the call of duty to protect children. In April 2019, I was honoured to host the IWF’s annual report here in Parliament. I was profoundly shocked and horrified by what I heard that day and in my continued interactions with the IWF.

That day, the IWF told the story of a little girl called Olivia. Olivia was just three years old when IWF analysts saw her. She was a little girl, with big green eyes and golden-brown hair. She was photographed and filmed in a domestic setting. This could have been any bedroom or bathroom anywhere in the country, anywhere in the world. Sadly, it was her home and she was with somebody she trusted. She was in the hands of someone who should have been there to look after her and nurture her. Instead, she was subjected to the most appalling sexual abuse over several years.

The team at the IWF have seen Olivia grow up in these images. They have seen her be repeatedly raped, and the torture she was subjected to. They tracked how often they saw Olivia’s images and videos over a three-month period. She appeared 347 times. On average that is five times every single day. In three in five of those images, she was being raped and tortured. Her imagery has also been identified as being distributed on commercial websites, where people are profiting from this appalling abuse.

I am happy to say that Olivia, thankfully, was rescued by law enforcement in 2013 at the age of eight, five years after her abuse began. Her physical abuse ended when the man who stole her childhood was imprisoned, but those images remain in circulation to this day. We know from speaking with adult survivors who have experienced revictimisation that it is the mental torture that blights lives and has an impact on their ability to leave their abuse in the past.

This Bill is supposed to help children like Olivia—and believe you me, she is just one of many, many children. The scale of these images in circulation is deeply worrying. In 2022, the IWF removed a record 255,000 web pages containing images of the sexual abuse and exploitation of children. Each one of these web pages can contain anything from one individual image of a child like Olivia, to thousands.

The IWF’s work is vital in removing millions of images from the internet each and every year, day in, day out. These guardian angels work tirelessly to stop this. As its CEO Susie Hargreaves often tells me, the world would be a much better place if the IWF did not have to exist, because this would mean that children were not suffering from sexual abuse or having such content spread online. But sadly, there is a need for the IWF. In fact, it is absolutely vital to the online safety landscape in the UK. As yet, this Bill does not go anywhere near far enough in recognising the important contribution the IWF has to make in implementing this legislation.

Victims of sexual abuse rely upon the IWF to protect and fight for them, safe in the knowledge that the IWF is on their side, working tirelessly to prevent millions of people potentially stumbling across their images and videos. This amendment is so important because, as my noble friend said, any delay to establishing roles and responsibilities of organisations like the IWF in working with Ofcom under the regulator regime risks leaving a vacuum in which the risks to children like Olivia will only increase further.

I urge the Government to take action to ensure that Ofcom clarifies how it intends to work with the Internet Watch Foundation and acknowledges the important part it has to play. We are months away from the Bill finally receiving Royal Assent. For children like Olivia, it cannot come soon enough; but it will not work as well as it could without the involvement of the Internet Watch Foundation. Let us make sure that we get this right and safeguard our children by accepting this amendment.

Baroness Merron (Lab)

My Lords, as the noble Lord, Lord Clement-Jones, observed, we have approached this group in an interesting way, having already heard the Minister’s feelings about the amendment. As I always think, forewarned is forearmed—so at least we know our starting point, and I am sure the Minister has listened to the debate and is reflecting.

I start by welcoming government Amendment 98A. We certainly value the work of various commissioners, but this amendment does not provide for what I would call a comprehensive duty. It needs supplementing by other approaches, and these are provided for by the amendments in this group.

The noble Baronesses, Lady Morgan, Lady Benjamin and Lady Kidron, and my noble friend Lady Healy and others, have made a powerful case for the Internet Watch Foundation being the designated expert body. I too wish to pay tribute to those who tackle online child sexual exploitation and abuse. They do it on behalf of all of us, but most notably the children they seek to protect, and their work is nothing short of an act of service.

Amendment 220E is in the names of the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Morgan. Despite the recommendation by the Joint Committee that scrutinised the draft Bill in December 2021 that the Internet Watch Foundation’s role in the future regulatory landscape should be clearly identified within the timescale set, such a role would need to be agreed with Ofcom, which has not yet happened. Perhaps the Minister can give the Committee some sense of where he feels Ofcom is in respect of the inclusion of the Internet Watch Foundation.

Baroness Ritchie of Downpatrick (Lab)

My Lords, I am very happy to move Amendment 29 and to speak to Amendments 83 and 103, which are also in my name. We have just had a debate about the protection of children online, and this clearly follows on from that.

The intention of the Bill is to set general parameters through which different content types can be regulated. The problem with that approach, as the sheer number of amendments highlights, is this: not all content and users are the same, and therefore cannot be treated in the same way. Put simply, not all content online should be legislated for in the same way. That is why the amendments in this group are needed.

Pornography is a type of content that cannot be regulated in general terms; it needs specific provisions. I realise that some of these issues were raised in the debate last Tuesday on amendments in my name, and again on Thursday when we discussed harms to children. I recognise too that, during his response to Thursday’s debate, the Minister made a welcome announcement on primary priority content which I hope will be set out in the Bill, as we have been asking for during this debate. While we wait to see the detail of what that announcement means, I think it safe to assume that pornography will be one of the harms named in the Bill, which makes discussion of these amendments that bit more straightforward.

Given that context, in Clause 11(3), user-to-user services that fall under the scope of Part 3 of the Bill have a duty to prevent children from accessing primary priority content. This duty is repeated in Clause 25(3) for search services. That duty is, however, qualified by the words,

“using proportionate systems and processes”.

It is the word “proportionate” and how that would apply to the regulation of pornography that is at the heart of the issue.

Generally speaking, acting in a proportionate way is a sensible approach to legislation and regulation. For the most part, regulation and safeguards should ensure that a duty is not onerous or that it does not place a disproportionate cost on the service provider that may make their business unviable. While that is the general principle, proportionality is not an appropriate consideration for all policy decisions.

In the offline world, legislation and regulation is not always proportionate. This is even more stark when regulating for children. The noble Lord, Lord Bethell, raised the issue of the corner shop last Tuesday, and that example is apt to highlight my point today. We do not take a proportional approach to the sale of alcohol or cigarettes. We do not treat a corner shop differently from a supermarket. It would be absurd if I were to suggest that a small shop should apply different age checks for children when selling alcohol, compared to the age checks we expect a large supermarket to apply. Therefore, in the same way, we already do not apply proportionality to some online activities. For example, gambling is an activity that is age-verified for children. Indeed, gambling companies are not allowed to make their product attractive to children and must advertise in a regulated way to avoid harm to children and young people. The harm caused to children by gambling is significant, so the usual policy considerations of proportionality do not apply. Clearly, both online and offline, there are some goods and services to which a proportionality test is not applied; there is no subjectivity. A child cannot buy alcohol or gamble and should not be able to access pornography.

In the UK, there is a proliferation of online gambling sites. It would be absurd to argue that the size of a gambling company or the revenue that company makes should be a consideration in whether it should utilise age verification to prevent children placing a bet. In the same way, it would be absurd to argue that the size or revenue of a pornographic website could be used as an argument to override a duty to ensure that age verification is employed to ensure that children do not access that website.

This is not a grey area. It is beyond doubt that exposing children to pornography is damaging to their health and development. The Children’s Commissioner’s report from this year has been much quoted already in Committee but it is worth reminding your Lordships what she found: that pornography was “widespread and normalised”, to the extent that children cannot opt out. The average age at which children first see pornography is 13. By age nine, 10% had seen it, 27% had seen it by age 11 and half had seen it by age 13. The report found that frequent users of pornography are more likely to engage—unfortunately and sadly—in physically aggressive sex acts.

There is nothing proportionate about the damage of pornographic content. The size, number of visitors, financial budget or technical know-how must not be considerations as to whether or not to deploy age checks. If a platform is incapable for any reason of protecting children from harmful exposure to pornography, it must remove that content. The Bill should be clear: if there is pornography on a website, it must use age verification. We know that pornographic websites will do all they can to evade age verification. In France and Germany, which are ahead of us in passing legislation to protect minors from pornography, regulators are tangled up in court action as the pornographic sites they first targeted for enforcement action argue against the law.

We must also anticipate the response of websites that are not dedicated exclusively to pornography, especially social media—a point we touched on during Tuesday’s debate. Reuters reported last year that an internal Twitter presentation stated that 13% of tweets were pornographic. Indeed, the Children’s Commissioner has found that Twitter is the platform where young people are most likely to encounter pornographic content. I know that some of your Lordships are concerned about age-gating social media. No one is suggesting that social media should exclude children, a point that has been made already. What I am suggesting is that pornography on that platform should be subject to age verification. The capabilities already exist to do this. New accounts on Twitter have to opt in to view pornographic content. Why cannot the opt-in function be age-gated? Twitter is moving to subscription content. Why can it not make pornographic content subscription-based, with the subscription being age-verified? The solutions exist.

The Minister may seek to reassure the House that the Bill as drafted would not allow any website or search facility regulated under Part 3 that hosts pornographic content to evade its duties because of size, capacity or cost. But, as we have seen in France, these terms will be subject to court action. I therefore trust that the Government will bring forward an amendment to ensure that any platform that hosts pornographic content will employ age verification, regardless of any other factors. Perhaps the Minister in his wind-up can provide us with some detail or a hint of a future amendment at Report. I look forward to hearing and considering the Minister’s response. I beg to move.

Baroness Benjamin (LD)

My Lords, I wish to speak in support of Amendments 29, 83 and 103 in the name of the noble Baroness, Lady Ritchie. I am extremely pleased that the Minister said last Tuesday that pornography will be within primary priority content; he then committed on Thursday to naming primary priority content in the Bill. This is good news. We also know that pornography will come within the child safety duties in Clause 11. This makes me very happy.

In the document produced for the Government in January 2021, the BBFC said that there were millions of pornographic websites—I repeat, millions—and many of these will come within Part 3 of the Bill because they allow users to upload videos, make comments on content and chat with other users. Of course, some of these millions of websites will be very large, which means by definition that we expect them to come within the scope of the Bill. Under Clause 11(3) user-to-user services have a duty to prevent children accessing primary priority content. The duty is qualified by the phrase

“using proportionate systems and processes”.

The factors for deciding what is proportionate are set out in Clause 11(11): the potential harm of the content based on the children’s risk assessment, and the size and capacity of the provider of the service. Amendments 29, 83 and 103 tackle the issue of size and capacity.

Baroness Kidron (CB)

Let us leave it there.

Baroness Benjamin (LD)

My Lords, I speak in support of these amendments with hope in my heart. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, for leading the charge with such vigour, passion and determination: I am with them all the way.

The Government have said that the purpose of the Bill is to protect children, and it rests on our shoulders to make sure it delivers on this mission. Last week, on the first day in Committee, the Minister said:

“Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does”.—[Official Report, 19/4/23; cols. 274-75.]


This is excellent and I thank the Government for saying it. But the full range of harms and risks to children will not be mitigated by services if they do not know what they are expected to risk-assess for and if they must wait for secondary legislation for this guidance.

The comprehensive range of harms children face every day is not reflected in the Bill. This includes sexual content that does not meet the threshold of pornography. This was highlighted recently in an investigation into TikTok by the Telegraph, which found that a 13-year-old boy was recommended a video about the top 10 porn-making countries, and that a 13-year-old girl was shown a livestream of a pornography actor in her underwear answering questions from viewers. This content is being marketed to children without a user even seeking out pornographic content, but this would still be allowed under the Bill.

Furthermore, high-risk challenges, such as the Benadryl and blackout challenges, which encourage dangerous behaviour on TikTok, are not dealt with in the Bill. Some features, such as the ability of children to share their location, are not dealt with either. I declare an interest as vice-president of Barnardo’s, which has highlighted how these features can be exploited by organised criminal gangs that sexually exploit children to keep tabs on them and trap them in a cycle of exploitation.

It cannot be right that the user-empowerment duties in the Bill include a list of harmful content that services must enable adults to toggle off, yet the Government refuse to produce this list for children. Instead, we have to wait for secondary legislation to outline harms to children, causing further delay to the enforcement of services’ safety duties. Perhaps the Minister can explain why this is.

The four Cs framework of harm, as set out in these amendments, is a robust framework that will ensure service risk assessments consider the full range of harms children face. I will repeat it once again: childhood lasts a lifetime, so we cannot fail children any longer. Protections are needed now, not in years to come. We have waited far too long for this. Protections need to be fast-tracked and must be included in the Bill. That is why I fully support these amendments.

Lord Knight of Weymouth (Lab)

My Lords, in keeping with the Stevenson-Knight double act, I am leaving it to my noble friend to wind up the debate. I will come in at this point with a couple of questions and allow the Minister to have a bit of time to reflect on them. In doing so, I reinforce my support for Amendment 295 in the name of the noble Lord, Lord Russell, which refers to volume and frequency also being risk factors.

When I compare Amendment 20 with Clause 10(6), which refers to children’s risk assessments and what factors should be taken into account in terms of the risk profile, I see some commonality and then some further things which Amendment 20, tabled by the noble Baroness, Lady Kidron, adds. In my opinion, it adds value. I am interested in how the Minister sees the Bill, as it stands currently, covering some issues that I will briefly set out. I think it would be helpful for the Committee to understand whether there are ways in which the Bill already deals with some of the issues so wonderfully raised by the noble Baroness, and to flush those out.

I do not see proposed new subsection (b)(iii),

“risks which can build up over time”,

mentioned in the Bill, nor explicit mention of proposed new subsection (b)(iv),

“the ways in which level of risks can change when experienced in combination with others”,

which I think is critical in terms of the way the systems work. Furthermore, proposed new subsection (b)(vii),

“the different ways in which the service is used including but not limited to via virtual and augmented reality technologies”,

starts to anticipate some other potential harms that may be coming very rapidly towards us and our children. Again, I do not quite see it included. I see “the design of functionalities”, “the business model” and “the revenue model”. There is a lot about content in the original wording of the Bill, less so here, and, clearly, I do not see anything in respect of the UN Convention on the Rights of the Child, which has been debated in separate amendments anyway. I wanted to give the Minister some opportunity to reflect on that.

Baroness Benjamin (LD)

My Lords, I add my support for all the amendments in this group. I thank the noble Baroness, Lady Ritchie, for bringing the need for the consistent regulation of pornographic content to your Lordships’ attention. Last week, I spoke about my concerns about pornography; I will not repeat them here. I said then that the Bill does not go far enough on pornography, partly because of the inconsistent regulation regimes between Part 3 services and Part 5 ones.

In February, the All-Party Parliamentary Group on Commercial Sexual Exploitation made a series of recommendations on the regulation of pornography. Its first recommendation was this:

“Make the regulation of pornography consistent across different online platforms, and between the online and offline spheres”.


It went on to say:

“The reforms currently contained in the Online Safety Bill not only fail to remedy this, they introduce further inconsistencies in how different online platforms hosting pornography are regulated”.


This is our opportunity to get it right but we are falling short. The amendments in the name of the noble Baroness, Lady Ritchie, go to the heart of the issue by ensuring that the duties that currently apply to Part 5 services will also apply to Part 3 services.

Debates about how these duties should be amended or implemented will be dealt with later on in our deliberations; I look forward to coming back to them in detail then. Today, the question is whether we are willing to have inconsistent regulation of pornographic content across the services that come into the scope of the Bill. I am quite sure that, if we asked the public in an opinion poll whether this was the outcome they expected from the Bill, they would say no.

An academic paper published in 2021 reported on the online viewing of 16- and 17-year-olds. It said that pornography was much more frequently viewed on social media, showing that the regulation of such sites remains important. The impact of pornography is no different whether it is seen on a social media or pornography site with user-to-user facilities that falls within Part 3, or on a site that has only provider content and would fall within Part 5. There should not be an either/or approach to different services providing the same content, which is why I think that Amendment 125A is critical. If all pornographic content is covered by Part 5, what does and does not constitute user-generated material ceases to be our concern. Amendment 125A highlights this issue; I too look forward to hearing the Minister’s response.

There is no logic to having different regulatory approaches in the same Bill. They need to be the same and come into effect at the same time. That is the simple premise of these amendments; I fully support them.

Baroness Harding of Winscombe (Con)

My Lords, earlier today the noble Baroness, Lady Benjamin, referred to a group of us as kindred spirits. I suggest that all of us contributing to this debate are kindred spirits in our desire to see consistent outcomes. All of us would like to see a world where our children never see pornography on any digital platform, regardless of what type of service it is. At the risk of incurring the ire of my noble friend Lord Moylan, we should have zero tolerance for children seeing and accessing pornography.

I agree with the desire to be consistent, as the noble Baroness, Lady Ritchie, and the noble Lord, Lord Browne, said, but it is consistency in outcomes that we should focus on. I am very taken with the point made by the noble Lord, Lord Allan, that we must be very careful about the unintended consequences of a consistent regulatory approach that might end up with inconsistent outcomes.

When we get to it later—I am not sure when—I want to see a regulatory regime that is more like the one reflected in the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell. We need in the Bill a very clear definition of what age assurance and age verification are. We must be specific on the timing of introducing the regulatory constraints on pornography. We have all waited far too long for that to happen and that must be in the Bill.

I am nervous of these amendments that we are debating now because I fear other unintended consequences. Not only does this not incentivise general providers, as the noble Lord, Lord Allan, described them, to remove porn from their sites, but I fear that it incentivises them to remove children from their sites. That is the real issue with Twitter. Twitter has very few child users; I do not want to live in a world where our children are removed from general internet services because we have not put hard age gates on the pornographic content within them but instead encouraged those services to put an age gate on the front door. Just as the noble Lord, Lord Allan, said earlier today, I fear that, with all the best intentions, the desire to have consistent outcomes and these current amendments would regulate the high street rather than the porn itself.