(1 year, 7 months ago)
Lords Chamber
My Lords, I am very happy to move Amendment 29 and to speak to Amendments 83 and 103, which are also in my name. We have just had a debate about the protection of children online, and this clearly follows on from that.
The intention of the Bill is to set general parameters through which different content types can be regulated. The problem with that approach, as the sheer number of amendments highlights, is this: not all content and users are the same, and therefore cannot be treated in the same way. Put simply, not all content online should be legislated for in the same way. That is why the amendments in this group are needed.
Pornography is a type of content that cannot be regulated in general terms; it needs specific provisions. I realise that some of these issues were raised in the debate last Tuesday on amendments in my name, and again on Thursday when we discussed harms to children. I recognise too that, during his response to Thursday’s debate, the Minister made a welcome announcement on primary priority content which I hope will be set out in the Bill, as we have been asking for during this debate. While we wait to see the detail of what that announcement means, I think it safe to assume that pornography will be one of the harms named in the Bill, which makes discussion of these amendments that bit more straightforward.
Given that context, under Clause 11(3), user-to-user services that fall within the scope of Part 3 of the Bill have a duty to prevent children from accessing primary priority content. This duty is repeated in Clause 25(3) for search services. That duty is, however, qualified by the words,
“using proportionate systems and processes”.
It is the word “proportionate” and how that would apply to the regulation of pornography that is at the heart of the issue.
Generally speaking, acting in a proportionate way is a sensible approach to legislation and regulation. For the most part, regulation and safeguards should ensure that a duty is not onerous or that it does not place a disproportionate cost on the service provider that may make their business unviable. While that is the general principle, proportionality is not an appropriate consideration for all policy decisions.
In the offline world, legislation and regulation are not always proportionate. This is even more stark when regulating for children. The noble Lord, Lord Bethell, raised the issue of the corner shop last Tuesday, and that example is apt to highlight my point today. We do not take a proportionate approach to the sale of alcohol or cigarettes. We do not treat a corner shop differently from a supermarket. It would be absurd if I were to suggest that a small shop should apply different age checks for children when selling alcohol, compared to the age checks we expect a large supermarket to apply. In the same way, we already decline to apply proportionality to some online activities. For example, gambling is an activity that is age-verified for children. Indeed, gambling companies are not allowed to make their product attractive to children and must advertise in a regulated way to avoid harm to children and young people. The harm caused to children by gambling is significant, so the usual policy considerations of proportionality do not apply. Clearly, both online and offline, there are some goods and services to which a proportionality test is not applied; there is no subjectivity. A child cannot buy alcohol or gamble and should not be able to access pornography.
In the UK, there is a proliferation of online gambling sites. It would be absurd to argue that the size of a gambling company or the revenue that company makes should be a consideration in whether it should utilise age verification to prevent children placing a bet. In the same way, it would be absurd to argue that the size or revenue of a pornographic website could be used as an argument to override a duty to ensure that age verification is employed to ensure that children do not access that website.
This is not a grey area. It is beyond doubt that exposing children to pornography is damaging to their health and development. The Children’s Commissioner’s report from this year has been much quoted already in Committee but it is worth reminding your Lordships what she found: that pornography was “widespread and normalised”, to the extent that children cannot opt out. The average age at which children first see pornography is 13. By age nine, 10% had seen it, 27% had seen it by age 11 and half had seen it by age 13. The report found that frequent users of pornography are more likely to engage—unfortunately and sadly—in physically aggressive sex acts.
There is nothing proportionate about the damage of pornographic content. The size, number of visitors, financial budget or technical know-how must not be considerations as to whether or not to deploy age checks. If a platform is incapable for any reason of protecting children from harmful exposure to pornography, it must remove that content. The Bill should be clear: if there is pornography on a website, it must use age verification. We know that pornographic websites will do all they can to evade age verification. In France and Germany, which are ahead of us in passing legislation to protect minors from pornography, regulators are tangled up in court action as the pornographic sites they first targeted for enforcement action argue against the law.
We must also anticipate the response of websites that are not dedicated exclusively to pornography, especially social media—a point we touched on during Tuesday’s debate. Reuters reported last year that an internal Twitter presentation stated that 13% of tweets were pornographic. Indeed, the Children’s Commissioner has found that Twitter is the platform where young people are most likely to encounter pornographic content. I know that some of your Lordships are concerned about age-gating social media. No one is suggesting that social media should exclude children, a point that has been made already. What I am suggesting is that pornography on that platform should be subject to age verification. The capabilities already exist to do this. New accounts on Twitter have to opt in to view pornographic content. Why can the opt-in function not be age-gated? Twitter is moving to subscription content. Why can it not make pornographic content subscription based, with the subscription being age-verified? The solutions exist.
The Minister may seek to reassure the House that the Bill as drafted would not allow any website or search facility regulated under Part 3 that hosts pornographic content to evade its duties because of size, capacity or cost. But, as we have seen in France, these terms will be subject to court action. I therefore trust that the Government will bring forward an amendment to ensure that any platform that hosts pornographic content will employ age verification, regardless of any other factors. Perhaps the Minister in his wind-up can provide us with some detail or a hint of a future amendment at Report. I look forward to hearing and considering the Minister’s response. I beg to move.
My Lords, I wish to speak in support of Amendments 29, 83 and 103 in the name of the noble Baroness, Lady Ritchie. I am extremely pleased that the Minister said last Tuesday that pornography will be within primary priority content; he then committed on Thursday to naming primary priority content in the Bill. This is good news. We also know that pornography will come within the child safety duties in Clause 11. This makes me very happy.
In the document produced for the Government in January 2021, the BBFC said that there were millions of pornographic websites—I repeat, millions—and many of these will come within Part 3 of the Bill because they allow users to upload videos, make comments on content and chat with other users. Of course, some of these millions of websites will be very large, which means by definition that we expect them to come within the scope of the Bill. Under Clause 11(3) user-to-user services have a duty to prevent children accessing primary priority content. The duty is qualified by the phrase
“using proportionate systems and processes”.
The factors for deciding what is proportionate are set out in Clause 11(11): the potential harm of the content based on the children’s risk assessment, and the size and capacity of the provider of the service. Amendments 29, 83 and 103 tackle the issue of size and capacity.
My Lords, I speak in support of these amendments with hope in my heart. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, for leading the charge with such vigour, passion and determination: I am with them all the way.
The Government have said that the purpose of the Bill is to protect children, and it rests on our shoulders to make sure it delivers on this mission. Last week, on the first day in Committee, the Minister said:
“Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does”.—[Official Report, 19/4/23; cols. 274-75.]
This is excellent and I thank the Government for saying it. But the full range of harms and risk to children will not be mitigated by services if they do not know what they are expected to risk-assess for and if they must wait for secondary legislation for this guidance.
The comprehensive range of harms children face every day is not reflected in the Bill. This includes sexual content that does not meet the threshold of pornography. This was highlighted recently in an investigation into TikTok by the Telegraph, which found that a 13 year-old boy was recommended a video about the top 10 porn-making countries, and that a 13 year-old girl was shown a livestream of a pornography actor in her underwear answering questions from viewers. This content is being marketed to children without a user even seeking out pornographic content, but this would still be allowed under the Bill.
Furthermore, high-risk challenges, such as the Benadryl and blackout challenges, which encourage dangerous behaviour on TikTok, are not dealt with in the Bill. Some features, such as the ability of children to share their location, are not dealt with either. I declare an interest as vice-president of Barnardo’s, which has highlighted how these features can be exploited by organised criminal gangs that sexually exploit children to keep tabs on them and trap them in a cycle of exploitation.
It cannot be right that the user-empowerment duties in the Bill include a list of harmful content that services must enable adults to toggle off, yet the Government refuse to produce this list for children. Instead, we have to wait for secondary legislation to outline harms to children, causing further delay to the enforcement of services’ safety duties. Perhaps the Minister can explain why this is.
The four Cs framework of harm, as set out in these amendments, is a robust framework that will ensure service risk assessments consider the full range of harms children face. I will repeat it once again: childhood lasts a lifetime, so we cannot fail children any longer. Protections are needed now, not in years to come. We have waited far too long for this. Protections need to be fast-tracked and must be included in the Bill. That is why I fully support these amendments.
My Lords, in keeping with the Stevenson-Knight double act, I am leaving it to my noble friend to wind up the debate. I will come in at this point with a couple of questions and allow the Minister to have a bit of time to reflect on them. In doing so, I reinforce my support for Amendment 295 in the name of the noble Lord, Lord Russell, which refers to volume and frequency also being risk factors.
When I compare Amendment 20 with Clause 10(6), which refers to children’s risk assessments and what factors should be taken into account in terms of the risk profile, I see some commonality and then some further things which Amendment 20, tabled by the noble Baroness, Lady Kidron, adds. In my opinion, it adds value. I am interested in how the Minister sees the Bill, as it stands currently, covering some issues that I will briefly set out. I think it would be helpful if the Committee could understand that there may be ways that the Bill already deals with some of the issues so wonderfully raised by the noble Baroness; it would be helpful if we can flush those out.
I do not see proposed new subsection (b)(iii),
“risks which can build up over time”,
mentioned in the Bill, nor explicit mention of proposed new subsection (b)(iv),
“the ways in which level of risks can change when experienced in combination with others”,
which I think is critical in terms of the way the systems work. Furthermore, proposed new subsection (b)(vii),
“the different ways in which the service is used including but not limited to via virtual and augmented reality technologies”,
starts to anticipate some other potential harms that may be coming very rapidly towards us and our children. Again, I do not quite see it included. I see “the design of functionalities”, “the business model” and “the revenue model”. There is a lot about content in the original wording of the Bill, which is less so here, and, clearly, I do not see anything in respect of the UN Convention on the Rights of the Child, which has been debated in separate amendments anyway. I wanted to give the Minister some opportunity on that.
My Lords, I add my support for all the amendments in this group. I thank the noble Baroness, Lady Ritchie, for bringing the need for the consistent regulation of pornographic content to your Lordships’ attention. Last week, I spoke about my concerns about pornography; I will not repeat them here. I said then that the Bill does not go far enough on pornography, partly because of the inconsistent regulation regimes between Part 3 services and Part 5 ones.
In February, the All-Party Parliamentary Group on Commercial Sexual Exploitation made a series of recommendations on the regulation of pornography. Its first recommendation was this:
“Make the regulation of pornography consistent across different online platforms, and between the online and offline spheres”.
It went on to say:
“The reforms currently contained in the Online Safety Bill not only fail to remedy this, they introduce further inconsistencies in how different online platforms hosting pornography are regulated”.
This is our opportunity to get it right but we are falling short. The amendments in the name of the noble Baroness, Lady Ritchie, go to the heart of the issue by ensuring that the duties that currently apply to Part 5 services will also apply to Part 3 services.
Debates about how these duties should be amended or implemented will be dealt with later on in our deliberations; I look forward to coming back to them in detail then. Today, the question is whether we are willing to have inconsistent regulation of pornographic content across the services that come into the scope of the Bill. I am quite sure that, if we asked the public in an opinion poll whether this was the outcome they expected from the Bill, they would say no.
An academic paper published in 2021 reported on the online viewing habits of 16 and 17 year-olds. It said that pornography was much more frequently viewed on social media, showing that the regulation of such sites remains important. The impact of pornography is no different whether it is seen on a social media or pornography site with user-to-user facilities that falls within Part 3 or on a site that has only provider content that would fall within Part 5. There should not be an either/or approach to different services providing the same content, which is why I think that Amendment 125A is critical. If all pornographic content is covered by Part 5, what does and does not constitute user-generated material ceases to be our concern. Amendment 125A highlights this issue; I too look forward to hearing the Minister’s response.
There is no logic to having different regulatory approaches in the same Bill. They need to be the same and come into effect at the same time. That is the simple premise of these amendments; I fully support them.
My Lords, earlier today the noble Baroness, Lady Benjamin, referred to a group of us as kindred spirits. I suggest that all of us contributing to this debate are kindred spirits in our desire to see consistent outcomes. All of us would like to see a world where our children never see pornography on any digital platform, regardless of what type of service it is. At the risk of incurring the ire of my noble friend Lord Moylan, we should have zero tolerance for children seeing and accessing pornography.
I agree with the desire to be consistent, as the noble Baroness, Lady Ritchie, and the noble Lord, Lord Browne, said, but it is consistency in outcomes that we should focus on. I am very taken with the point made by the noble Lord, Lord Allan, that we must be very careful about the unintended consequences of a consistent regulatory approach that might end up with inconsistent outcomes.
When we get to it later—I am not sure when—I want to see a regulatory regime that is more like the one reflected in the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell. We need in the Bill a very clear definition of what age assurance and age verification are. We must be specific on the timing of introducing the regulatory constraints on pornography. We have all waited far too long for that to happen and that must be in the Bill.
I am nervous of these amendments that we are debating now because I fear other unintended consequences. Not only does this not incentivise general providers, as the noble Lord, Lord Allan, described them, to remove porn from their sites but I fear that it incentivises them to remove children from their sites. That is the real issue with Twitter. Twitter has very few child users; I do not want to live in a world where our children are removed from general internet services because we have not put hard age gates on the pornographic content within them but instead encouraged those services to put an age gate on the front door. Just as the noble Lord, Lord Allan, said earlier today, I fear that, with all the best intentions, the desire to have consistent outcomes and these current amendments would regulate the high street rather than the porn itself.
My Lords, one of our clergy in the diocese of Guildford has been campaigning for more than a decade, as have others in this Committee, on children’s access to online pornography. With her, I support the amendments in the names of the noble Baronesses, Lady Kidron and Lady Harding.
Her concerns eventually made their way to the floor of the General Synod of the Church of England in a powerful debate in July last year. The synod voted overwhelmingly in favour of a motion, which said that we
“acknowledge that our children and young people are suffering grave harm from free access to online pornography”
and urged us to
“have in place age verification systems to prevent children from having access to those sites”.
It asked Her Majesty’s Government to use their best endeavours to secure the passage and coming into force of legislation requiring age-verification systems preventing access by people under the age of 18. It also recommended more social and educational programmes to increase awareness of the harms of pornography, including self-generated sexually explicit images.
Introducing the motion, my chaplain, Reverend Jo Winn-Smith, said that age verification
“ought to be a no-brainer … Exposure to sexualised material is more likely to lead to young people engaging in more sexualised behaviour and to feel social pressure to have sex”,
as well as normalising sexual violence against girls and women. A speech from the chaplain-general of the Prison Service towards the end of the debate highlighted just where such behaviours and pressures could lead in extreme circumstances.
One major theme that emerged during the debate is highlighted by the amendments this afternoon: that access to online pornography goes far beyond materials that fall into what the Bill defines as Part 5 services. Another is highlighted in a further group of amendments: age assurance needs to be both mandatory and effective beyond reasonable doubt.
It was also noted how long this whole area has taken to get on to the statute book, given David Cameron’s proposals way back in 2013 and further legislation proposed in 2018 that was never enacted. Talk of secondary legislation to define harmful content in that regard is alarming, as a further amendment indicates, given the dragging of feet that has now been perpetuated for more than a decade. That is a whole generation of children and young people.
In an imaginative speech in the synod debate, the most reverend Primate the Archbishop of York, Archbishop Stephen, reminded us that the internet is not a platform; it is a public space, where all the rights and norms you would expect in public should apply. In the 1970s, he continued, we famously put fluoride in the water supply, because we knew it would be great for dental health; now is the opportunity to put some fluoride into the internet. I add only this: let us not water down the fluoride to a point where it becomes feeble and ineffective.
My Lords, I will speak in support of the amendments in this group in the names of the intrepid noble Baroness, Lady Kidron, the noble Baroness, Lady Harding, and my noble friend Lord Storey—we are kindred spirits.
As my noble friend said, the expectations of parents are clear: they expect the Bill to protect their children from all harm online, wherever it is encountered. The vast majority of parents do not distinguish between the different content types. To restrict regulation to user-to-user services, as in Part 3, would leave a great many websites and content providers, which are accessed by children, standing outside the scope of the Bill. This is a flagship piece of legislation; there cannot be any loopholes leaving any part of the internet unregulated. If there is a website, app, online game, educational platform or blog—indeed, any content that contains harmful material—it must be in the scope of the Bill.
The noble Baroness, Lady Kidron, seeks to amend the Bill to ensure that it aligns with the Information Commissioner’s age-appropriate design code—it is a welcome amendment. As the Bill is currently drafted, the threshold for risk assessment is too high. It is important that the greatest number of children and young people are protected from harmful content online. The amendments achieve that to a greater degree than the protection already in the Bill.
While the proposal to align with the age-appropriate design code is welcome, I have one reservation. Up until recently, it appears that the ICO was reluctant to take action against pornography platforms that process children’s data. It has perhaps been deemed that pornographic websites are unlikely to be accessed by children. Over the years, I have shared with this House the statistics of how children are accessing pornography and the harm it causes. The Children’s Commissioner also recently highlighted the issue and concerns. Pornography is being accessed by our children, and we must ensure that the provisions of the Bill are the most robust they can be to ensure that children are protected online.
I am concerned with ensuring two things: first, that any platform that contains harmful material falls under the scope of the Bill and is regulated to ensure that children are kept safe; and, secondly, that, as far as possible, what is harmful offline is regulated in the same way online. The amendments in the name of my noble friend Lord Storey raise the important question of online-offline equality. Amendments 33A and 217A seek to regulate online video games to ensure they meet the same BBFC ratings as would be expected offline, and I agree with that approach. Later in Committee, I will raise this issue in relation to pornographic content and how online content should be subject to the same BBFC guidance as content offline. I agree with what my noble friend proposes: namely, that this should extend to video game content as well. Video games can be violent and sexualised in nature, and controls should be in place to ensure that children are protected. The BBFC guidelines used offline appear to be the best way to regulate online as well.
Children must be kept safe wherever they are online. This Bill must have the widest scope possible to keep children safe, but ensuring online/offline alignment is crucial. The best way to keep children safe is to legislate for regulation that is as far reaching as possible but consistently applied across the online/offline world. These are the reasons why I support the amendments in this group.
My Lords, I will lend my support to Amendments 19 and 22. It is a pleasure to speak after the noble Baroness, Lady Benjamin. I may be one of those people in your Lordships’ House who relies significantly on the British Board of Film Classification for movie watching, as I am one of the faint-hearted.
In relation to app stores, it is not just children under 18 for whom parents need the age verification. If you are a parent of a child who has significant learning delay, the internet is a wonderful place where they can get access to material and have development that they might not ordinarily have had. But, of course, turning 17 or 18 is not the threshold for them. I have friends who have children with significant learning delay. Having that assurance, so they know which apps are which in the app store, goes well beyond 18 for them. Obviously it will not be a numerical equivalent for their child—now a young adult—but it is important to them to know that the content they get on a free app or an app purchased from the app store is suitable.
I just wanted to raise that with noble Lords, as children and some vulnerable adults—not all—would benefit from the kind of age verification that we have talked about. I appreciate the points that the noble Lord, Lord Allan, raised about where the Bill has ended up conceptually and the framework that Ofcom will rely on. Like him, I am a purist sometimes but, pragmatically, I think that the third concept raised by the noble Baroness, Lady Kidron, about protection and putting this in the app store and bringing it parallel with things such as classification for films and other video games is really important.
My Lords, I violently agree with my noble friend Lord Moylan that the grouping of this amendment is unfortunate. For that reason I am not going to plunge into the issue in huge detail, but there are a couple of things I would like to reassure my noble friend on, and I have a question for the Minister.
The noble Baroness, Lady Kidron, said there is a package of amendments around age verification and that we will have a lot of time to dive into this, and I think that is probably the right format for doing it. However, I reassure my noble friend Lord Moylan that he is absolutely right. The idea is not in any way to shut off the town square from everyone simply because there might be something scary there.
Clause 11(3) refers to priority content, which the noble Lord will know is to do with child abuse and fraudulent and severely violent content. This is not just any old stuff; this is hardcore porn and the rest. As in the real world, that content should be behind an age-verification barrier. At the moment we have a situation on the internet where, because it has not been well-managed for a generation, this content has found itself everywhere: on Twitter and Reddit, and all sorts of places where really it should not be because there are children there. We envisage a degree of tidying up of social media and the internet to make sure that the dangerous content is put behind age verification. What we are not seeking to do, and what would not be a benign or positive action, is to put the entire internet behind some kind of age-verification boundary. From that point of view, I completely agree with my noble friend.
My Lords, as might be expected, I will speak against Amendment 26 and will explain why.
The children’s charity Barnardo’s—here I declare an interest as vice-president—has said, as has been said several times before, that children are coming across pornographic content from as young as seven. Often they stumble across the content accidentally, unwittingly searching for terms such as “sex” or “porn”, without knowing what they mean. The impact that this is having on children is huge. It is harming their mental health and distorting their perception of healthy sexual relationships and consent. That will go with them into adulthood.
Age verification for pornography and age assurance to protect children from other harms are crucial to protect children from this content. In the offline world, children are rightly not allowed to buy pornographic DVDs in sex shops but online they can access this content at the click of a button. This is why I will be supporting the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, and am fully supportive of their age assurance and age verification schedule.
My Lords, to go back not just to the age question, the noble Lord, Lord Allan of Hallam, reminded us that community-led moderation is not just Wikipedia. What I tried to hint at earlier is that that is one of the most interesting, democratic aspects of the online world, which we should protect.
We often boast that we are a self-regulating House and that that makes us somehow superior to those up the road—we are all so mature because we self-regulate; people do behave badly but we decide. It is a lesson in democracy that you have a self-regulating House, and there are parts of the online world that self-regulate. Unless we think that the citizens of the UK are less civilised than Members of the House of Lords, which I would refute, we should say that it is positive that there are self-moderating, self-regulating online sites. If you can say something and people can object and have a discussion about it, and things can be taken down, to me that is the way we should deal with speech that is inappropriate or wrong. The bulk of these amendments—I cannot remember how many there are now—are right.
I was glad that the noble Lord, Lord Moylan, said he could not understand why this grouping had happened, which is what I said earlier. I had gone through a number of groupings thinking: “What is that doing there? Am I missing something? Why is that in that place?” I think we will come back to the age verification debate and discussion.
One thing to note is that one of the reasons organisations such as Wikipedia would be concerned about age verification—and they are—is anonymity. It is something we have to consider. What is going to happen to anonymity? It is so important for journalists, civil liberty activists and whistleblowers. Many Wikipedia editors are anonymised, maybe because they are politically editing sites on controversial issues. Imagine being a Wikipedia editor from Russia at the moment—you would not want to have to say who you are. We will come back to it but it is important to understand that Amendment 26, and those who are saying that we should look at the question of age verification, are not doing so because they do not care about children and are not interested in protecting them. However, the dilemmas of any age-gating or age verification for adult civil liberties have to be considered. We have to worry that, because of an emphasis on checking age, some websites will decide to sanitise what they allow to be published to make it suitable for children, just in case they come across it. Again, that will have a detrimental impact on adult access to all knowledge.
These will be controversial issues, and we will come back to them, but it is good to have started the discussion.
(1 year, 8 months ago)
Lords Chamber
My Lords, I have not engaged with this amendment in any particular detail—until the last 24 hours, in fact. I thought that I would come to listen to the debate today and see if there was anything that I could usefully contribute. I have been interested in the different points that have been raised so far. I find myself agreeing with some points that are perhaps in tension or conflict with each other. I emphasise from the start, though, my complete respect for the Joint Committee and the work that it did in the pre-legislative scrutiny of the Bill. I cannot compare my knowledge and wisdom on the Bill with those who, as has already been said, have spent so much intensive time thinking about it in the way that they did at that stage.
Like my noble friend Lady Harding, I always have a desire for clarity of purpose. It is critical for the success of any organisation, or anything that we are trying to do. As a point of principle, I like the idea of setting out at the start of this Bill its purpose. When I looked through the Bill again over the last couple of weeks in preparation for Committee, it was striking just how complicated and disjointed a piece of work it is and so very difficult to follow.
There are many reasons why I am sympathetic towards the amendment. I can see how bringing together at the beginning of the Bill what are currently described as “Purposes” might help it to meet its overall aims. But that brings me to some of the points that the noble Baroness, Lady Fox, has just made. The Joint Committee’s report recommends that the objectives of the Bill
“should be that Ofcom should aim to improve online safety for UK citizens by ensuring that service providers”—
it then set out objectives aimed at Ofcom, rather than those objectives actually being the purposes of the Bill.
I was also struck by what the noble Lord, Lord Allan, said about what we are looking for. Are we looking for regulation of the type that we would expect of airlines, or of the kind we would expect from the car industry? If we are still asking that question, that is very worrying. I think we are looking for something akin to the car industry model as opposed to the airline model. I would be very grateful if my noble friend the Minister was at least able to give us some assurance on that point.
If I were to set out a purpose of the Bill at the beginning of the document, I would limit myself to what is currently in proposed new subsection (1)(g), which is
“to secure that regulated internet services operate with transparency and accountability in respect of online safety”.
That is all I would say, because that, to me, is what this Bill is trying to do.
The other thing that struck me when I looked at this—I know that there has been an approach to this legislation that sought to adopt regulation that applies to the broadcasting world—was the thought, “Somebody’s looked at the BBC charter and thought, well, they’ve got purposes and we might adopt a similar sort of approach here.” The BBC charter and the purposes set out in it are important and give structure to the way the BBC operates, but they do not give the kind of clarity of purpose that my noble friend Lady Harding is seeking—which I too very much support and want to see—because there is almost too much there. That is where I would start when setting out a very simple statement of purpose for this Bill.
My Lords, this day has not come early enough for me. I am pleased to join others on embarking on the Committee stage of the elusive Online Safety Bill, where we will be going on an intrepid journey, as we have heard so far. Twenty years ago, while I was on the Ofcom content board, I pleaded for the internet to be regulated, but was told that it was mission impossible. So this is a day I feared might not happen, and I thank the Government for making it possible.
I welcome Amendment 1, in the names of the noble Lords, Lord Stevenson, Lord Clement-Jones, and others. It does indeed encapsulate the overarching purpose of the Bill. But it also sets out the focus of what other amendments will be needed if the Bill is to achieve the purpose set out in that amendment.
The Bill offers a landmark opportunity to protect children online, and it is up to us to make sure that it is robust, effective and evolvable for years to come. In particular, I welcome subsection (1)(a) and (b) of the new clause proposed by Amendment 1. Those paragraphs highlight an omission in the Bill. If the purposes set out in them are to be met, the Bill needs to go much further than it currently does.
Yes, the Bill does not go far enough on pornography. The amendment sets out a critical purpose for the Bill: children need a “higher level of protection”. The impact that pornography has on children is known. It poses a serious risk to their mental health and their understanding of consent, healthy sex and relationships. We know that children as young as seven are accessing pornographic content. Their formative years are being influenced by hardcore, abusive pornography.
As I keep saying, childhood lasts a lifetime, so we need to put children first. This is why I have dedicated my life to the protection of children and their well-being. This includes protection from pornography, where I have spent over a decade campaigning to prevent children easily accessing online pornographic content.
I know that others have proposed amendments that will be debated in due course which meet this purpose. I particularly support the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. Those amendments meet the purpose of the Bill by ensuring that children are protected from pornographic content wherever it is found through robust, anonymous age verification that proves the user’s age beyond reasonable doubt.
Online pornographic content normalises abusive sexual acts, with the Government’s own research finding
“substantial evidence of an association between the use of pornography and harmful sexual attitudes and behaviours towards women”
and children. This problem is driven largely by the types of content that are easily available online. Pornography is no longer the stereotype that we might imagine from the 1970s and 1980s. It is now vicious, violent and pervasive. Content that would be prohibited offline is readily available online for free with just a few clicks. The Online Safety Bill comes at a crucial moment to regulate online pornography. That is why I welcome the amendment introducing a purpose to the Bill that ensures that internet companies “comply with UK law”.
We have the Obscene Publications Act 1959 and UK law does not allow the offline distribution of material that sexualises children—such as “barely legal” pornography, where petite-looking adult actors are made to look like children—content which depicts incest and content which depicts sexual violence, including strangulation. That is why it is important that the Bill makes that type of material illegal online as well. Such content poses a high risk to children as well as women and girls. There is evidence that such content acts as a gateway to more hardcore material, including illegal child sexual abuse material. Some users spiral out of control, viewing content that is more and more extreme, until the next click is illegal child sexual abuse material, or even going on to contact and abuse children online and offline.
My amendment would require service providers to exclude from online video on-demand services any pornographic content that would be classified as more extreme than R18 and that would be prohibited offline. This would address the inconsistency between online and offline regulation of pornographic content—
My Lords, we have had a good-natured and informative opening debate, but we should keep our remarks to this particular amendment, in the knowledge that all future amendments will have their rightful discussion in due course.
I thank the noble Lord. The amendments I support are backed by CEASE, Refuge and Barnardo’s—I declare an interest here. Let us not let the chance of creating a robust Online Safety Bill slip through our fingers. It is now time to act with boldness, vision, morality and determination. I trust that we will continue to focus on the purpose of the Bill: to make the online world safer, especially for our children. They are relying on us to do the right thing, so let us do so.
I strongly support my noble friend in his amendment. I clarify that, in doing so, I am occupying a guest slot on the Front Bench: I do so as a member of his team but also as a member of the former Joint Committee. As my noble friend set out, this reflects where we got to in our thinking as a Joint Committee all that time ago. My noble friend said “at last”, and I echo that and what others said. I am grateful for the many briefings and conversations that we have had in the run-up to Committee, but it is good to finally be able to get on with it and start to clear some of these things out of my head, if nothing else.
In the end, as everyone has said, this is a highly complex Bill. Like the noble Baroness, Lady Stowell, in preparation for this I had another go at trying to read the blooming thing, and it is pretty much unreadable—it is very challenging. That is right at the heart of why I think this amendment is so important. Like the noble Baroness, Lady Kidron, I worry that this will be a bonanza for the legal profession, because it is almost impenetrable when you work your way through the wiring of the Bill. I am sure that, in trying to amend it, some of us will have made errors. We have been helped by the Public Bill Office, but we will have missed things and got things the wrong way around.
It is important to have something purposive, as the Joint Committee wanted, and to have clarity of intent for Ofcom, including that this is so much more about systems than about content. Unlike the noble Baroness, Lady Stowell—clearly, we all respect her work chairing the communications committee and the insights she brings to the House—I think that a very simple statement, restricting it just to proposed new paragraph (g), is not enough. It would almost be the same as the description at the beginning of the Bill, before Clause 1. We need to go beyond that to get the most from having a clear statement of how we want Ofcom to do its job and the Secretary of State to support Ofcom.
I like what the noble Lord, Lord Allan, said about the risk of overcommitment and underdelivery. When the right reverend Prelate the Bishop of Oxford talked about being the safest place in the world to go online, which is the claim that has been made about the Bill from the beginning, I was reminded again of the difficulty of overcommitting and underdelivering. The Bill is not perfect, and I do not believe that it will be when this Committee and this House have finished their work; we will need to keep coming back and legislating and regulating in this area, as we pursue the goal of being the safest place in the world to go online—but it will not be any time soon.
I say to the noble Baroness, Lady Fox, whom I respect, that I understand what she is saying about some of her concerns about a risk-free child safety regime and the unintended consequences that may come with this legislation. But at its heart, what motivates us, and makes us believe that getting the Bill right is one of the most important things we will do in all our time in this Parliament, is the unintended consequences of the algorithms that these tech companies have created in pushing content at children that they do not want to see. I see the noble Baroness, Lady Kidron, wanting to comment.
(1 year, 10 months ago)
Lords Chamber
My Lords, I support the Bill. I congratulate the noble Baroness on bringing it to the House and on her passionate, common-sense opening speech.
In September 2022, the 3 Dads Walking—Andy Airey, Tim Owen and Mike Palmer—set off from Belfast on their second walk, which was part of a month-long, 600-mile trek between all four parliaments of the UK to raise awareness of suicide prevention across the country. They are only too aware of the influence that the internet can have on vulnerable young people. Their mission to raise awareness started after losing their beautiful daughters, Sophie, Emily and Beth, to suicide.
Before this trek across the country, the 3 Dads had previously walked between their homes, from Cumbria to Manchester to Norfolk. During those walks, they heard stories from so many parents and young people about the influence that the internet had on their loved ones in making that tragic decision to take their own life. To think that those young people could have been encouraged to self-harm, and ultimately take their own lives, through social media and the internet is unforgivable. It is totally unacceptable that vulnerable young people can be encouraged so readily into suicide, can research suicide methodology and can easily access the tools to take their own lives.
Regrettably, this is a story that the 3 Dads have heard many times. I have heard the same tragic tale from both parents and teachers who are involved in counselling children and young people in schools. Social media and internet search engine companies have a duty of care to their users. Positive signposting should be the norm. A search on the internet for suicide or self-harm should result in positive signposting to available help, not to the detail to which many search engines and social media platforms currently direct the user. We have to acknowledge that suicide prevention across society is complex but it is something we need to invest in.
We must not accept that suicide is the biggest killer of the under-35s and do nothing to prevent it, or turn a blind eye to the astonishing fact that over 200 schoolchildren take their own lives every year. What has society come to? There must be education in schools about this issue and its consequences, education that gives young people hope. I hope that the Online Safety Bill, which is now being debated in this House, will also play its part by bringing in legislation to safeguard and protect children and young people. That is so necessary.
This is a generation that has grown up around the internet, and as decision-makers we must do everything in our power to make that environment as safe as possible. I passionately believe that this Bill, together with suicide prevention being taught to kids in school and robust measures in the Online Safety Bill, would be a step in the right direction. Andy, Tim, Mike and I wholeheartedly support this Bill, as it will consider and protect vulnerable young people. Most of all, it will save lives.
(1 year, 10 months ago)
Lords Chamber
To ask His Majesty’s Government what steps they are taking to address the decline in production of commercial Public Service Broadcasting children’s television content.
My Lords, I beg leave to ask the Question standing in my name on the Order Paper and declare an interest as per the register.
My Lords, the Government recognise the unique social, educational and economic importance of children’s television, and that is why we have put in place a range of measures to support it. The ongoing animation and children’s tax relief schemes have supported the production of over 840 programmes. Working with the noble Baroness, we introduced powers for Ofcom to monitor and set criteria for the provision of children’s television. Children’s television was chosen to pilot contestable funding, which has supported more than 280 hours of new content.
I thank the Minister for his Answer. However, since the early closure of the Young Audiences Content Fund, which offered up to 50% of programme budgets, the amount of newly made UK commercial children’s content continues to decrease. The children’s television production sector faces market failure and a huge challenge. Without funding, television programmes that reflect British children’s lives could disappear from the nation’s screens, and that would be a tragedy. Pact is proposing new tax breaks of 40% to help keep that vitally important sector thriving. So how are the Government living up to their responsibility to ensure that the nation’s children are accessing high-quality British children’s programming? Will the tax breaks proposed by Pact be supported to ensure that we have more UK commercial public service broadcasting of children’s content?
(1 year, 10 months ago)
Lords Chamber
My Lords, I support this important Bill, but with some concerns. As drafted, it does not go far enough to fully protect children and young people online. The consequences of the policies we decide in this Bill will affect the whole of society in decades to come.
I have been working on the online pornography issue for the last 10 years. In April 2017, this House passed legislation that required age verification for pornography websites to prevent children accessing them. We were promised that social media platforms would be included later on, but that did not happen. It is hard to believe that almost six years ago this House passed the Digital Economy Act, whose Part 3 was never implemented by this Government. So here we are, still debating age verification for pornography. This is simply unacceptable—a shocking failure of society. It is now time to act fast, and we must make sure that we do it right.
I am concerned that the Bill does not go as far as what was passed in 2017. Even if the Bill is passed, I do not believe that it will deliver age verification quickly. If Ofcom’s road map on the implementation of the Bill is to be believed, it could be three years before enforcement proceedings are issued against pornography websites that allow children to access them.
Research by the BBFC found that children as young as seven are innocently stumbling across pornography online and that 51% of all children aged 11 to 13 have watched pornography online—according to Barnardo’s, 54 million times. We are creating a conveyor belt of children addicted to porn, which will affect their long-term well-being and sexual behaviour.
A fundamental problem with the Bill is that it does not deal with pornography as a harm. The Government state that it is designed to ensure that what is lawfully unacceptable offline would also be unacceptable online. However, in respect of pornographic content, the Bill as drafted does not meet that goal. Material that is extreme and prohibited offline is widely available online. Evidence shows that consumption of extreme and prohibited material, such as content that sexualises children—and that includes adults dressing up as children—can lead on to the viewing of illegal child sexual abuse material and an interest in child sex abuse. It is not only children who are at risk: men who watch extreme and prohibited material online are more likely to be abusive towards women and girls.
What is needed is a stand-alone part of the Bill that deals with all pornographic content and sets out a clear definition of what pornography is. Once defined, the Bill should require any website or social media platform with content that meets that definition to ensure that children cannot access that material, because porn can be a gateway to other harms. Contrary to what some people believe, technology exists that can accurately age-verify a user without compromising that person’s privacy. The groundwork is done, and as more countries implement this type of legislation, the industry is becoming increasingly equipped to deal with age verification. France and Germany are already taking legal action to enforce their own laws on the largest adult websites, with several already applying age checks. There is no reason why this cannot be implemented and enforced within six months of the Bill becoming law. If that is too hard for the social media platforms, they can simply remove porn from their pages until they are ready to keep that harm away from our kids.
Childhood lasts a lifetime, and we have the opportunity to ensure that pornography is not a harm inflicted on our children. We owe it to them. I declare an interest as vice-president of Barnardo’s.
(2 years, 1 month ago)
Lords Chamber
My Lords, I thank my noble friend Lord Foster for securing this important debate and for his kind remarks. I declare my interests as set out in the register.
Public service broadcasting content is vital nourishment for children and their well-being. It is the perfect way for them to recognise and understand the world that they live in. I have vast experience, knowledge and wisdom of the importance of this because of my 46 years of broadcasting for children, through programmes such as the BBC’s “Play School”. Many say that that programme, where I took them through the arched, round and square windows, shaped their lives, helped them deal with dark moments and gave them unconditional love and confidence to face adversity.
Before we meddle with the future of public service broadcasting, let us think of our children and what we are going to replace it with. Our thinking about the future of public service media must recognise that young people are taking the lead in the adoption of new services and developing loyalty to new platforms. If we ignore that young audience, we risk losing them permanently to a diet of international content that fails to connect them to the UK’s culture, engage them in UK society or reflect them as members of that society. Remember: childhood lasts a lifetime, so if we fail to provide content that has public service purposes at its heart, then public service media will be meaningless to them as they become adults. This is why the media Bill is crucial to ensuring that PSB has prominence, is inclusive and has fair value for its content, and that those broadcasters are able to compete fairly with global streamers and secure fair value for their investment in original UK content, which in turn has clear value for the UK.
Several areas of public policy are impacting on young people right now. First, Ofcom had identified market failure in the provision of children’s television in the UK, mainly associated with the reluctance of the commercial public service broadcasters to commission new content for children and teenagers. So the Young Audiences Content Fund, which I persuaded the then Digital Minister, Margot James, to establish for children’s production, was a successful pilot which addressed market failure. It disbursed £44 million over three years and provided 50% of the funding needed to generate new content for children and young people on ITV, Channel 4 and Channel 5. It coincided with the new regulatory powers that Ofcom was granted through the amendment I proposed to what is now the Digital Economy Act 2017. I fought hard for that amendment in this House. At last, Ofcom could insist that public service broadcasters provide programmes for young people. Ofcom worked with broadcasters, allowing them flexibility to run content on their dedicated children’s channels or online. It was a perfect carrot and stick approach.
However, the fund was abruptly brought to a halt when the DCMS did a deal with the BBC in the current licence fee settlement to abandon contestable funding. This was the death-knell for the Young Audiences Content Fund—for broadcasters, producers and the audience, who benefited from the dozens of new projects that it generated. This was a successful new way of tackling market failure. It addressed the issue of “If you can’t see it, you can’t be it”. For a child to truly aspire—for a child to feel welcome in their own culture and the broader culture of the country in which they live—they need to be included in the media they consume. The fund achieved this. We urgently need to create a successor to the fund to support a plural system of public service provision, such as through levies or lottery funding, not by top-slicing the licence fee.
The second policy impact on the children’s audience is the BBC children’s department. It is currently focusing on animation, which is expensive and will impact on the amount of commissioning of live-action content for kids. But Ofcom must not roll back the requirement for the BBC to commission content for children in a variety of genres, and at a reasonable number of hours per year. We know that the BBC plans to place all CBBC content on the iPlayer in the long term. While positioning BBC content on platforms such as YouTube could be a way of recapturing the lost audience, this should not be done at the expense of independent producers, who currently hold the rights to exploit their content on digital platforms. Clearly, there is pressure on the finance for children’s content at the BBC, so we need to ensure that the BBC is funded securely and in a fair manner. Without secure and adequate funding, the BBC will short-change the children’s audience and continue to lose them to YouTube, TikTok and the streamers.
Another important aspect of public service broadcasting is that it provides something for everyone. Channel 4 is currently helping the UK fulfil that promise through its commitment to 13 to 16-year-olds. A fully commercial Channel 4 would be highly unlikely to serve this niche group. Once again, public policy decisions will impact heavily on this audience. A privatised Channel 4 would not serve the young children’s audience well. I am frightened to say that as a result of public policy decisions, market failure is back with a vengeance. Will the Government commit to finding new methods of funding competitive, trusted public service content for children and young people? They are the future; please, let us not fail them.
(2 years, 1 month ago)
Lords Chamber
To ask His Majesty’s Government what plans they have to implement Part 3 of the Digital Economy Act 2017 to protect children from online pornography, until Ofcom begins any enforcement of the same under the Online Safety Bill.
The Government have decided to use the Online Safety Bill to protect children from online pornography. This will provide greater protection to children across a wider range of services, and we expect that it will be implemented as quickly as the Digital Economy Act—if not more so. The Government are committed to bringing the Bill back to Parliament and are working closely with Ofcom to ensure that the implementation period following passage of the legislation is as short as possible.
My Lords, 18 months ago I urged Ministers to commence Part 3 of the Digital Economy Act, so that we can put protection from harmful pornography in place for children. I was told that that would take two years, so any benefits of an interim measure would be minimal at best. Since then, millions of children, as young as seven, have accessed violent online porn, in some cases causing mental health problems and the urge to sexually assault other children. We now know that Ofcom’s road map for regulation demonstrates that there will be no enforcement of the Online Safety Bill before 2025. Ofcom is taking over three years to begin enforcing laws on video-sharing platforms. Does the Minister now accept that we could have protected children three years sooner, and will the Government now commence Part 3, so that it is enforced until the new Bill is ready to replace it, and protect our vulnerable children?
I thank the noble Baroness for those questions. We must be clear about why the Digital Economy Act was criticised. It was criticised principally because it did not cover social media companies, which host a considerable quantity of pornographic material, and there were other sites that it did not cover. It also considered only ISPs as gatekeepers. A number of flaws have been identified in the Digital Economy Act, and we will address those with a stronger Online Safety Bill, better targeted at protecting children.