Baroness Howe of Idlicote (Crossbench - Life peer)
Lords Chamber
My Lords, at this Second Reading of the Digital Economy Bill in your Lordships’ House it is a pleasure to speak today on a subject that I have regularly brought before the House in recent years through a considerable number of my Online Safety Bills. It is remarkable to see that the concerns raised several years ago about the number and nature of the foreign websites that are being accessed by the UK’s young people are finally being addressed. We are beginning to witness the offline protections that parents expect for their children being applied online. So it is rather an historic day. I begin by congratulating the Government, and particularly the noble Baroness, Lady Shields, on Part 3, on which I have spent quite a lot of time with her.
I am also delighted that the BBFC has been appointed as the age verification regulator. Having met, with other noble Lords, a number of leading age verification providers, I am reassured that the available technology is robust and developed on the basis of the principle of privacy by design, which means that complete anonymity can be preserved.
As we have heard, on Report in another place the Minister indicated that the Government would introduce an amendment to the Bill in your Lordships’ House on adult content filters. However, adult content filters should not be confused with the age verification checks proposed in Part 3. The checks relate specifically to pornography, whereas the adult content filters are network-level filters that offer to filter out adult content in the round: violence, drug use, gambling, self-harm, and so on. The adult content filtering regime was promoted very effectively by the previous Prime Minister, David Cameron, who acknowledged that it was being undermined by the EU net neutrality regulations on 28 October 2015, when he stated that,
“we secured an opt-out yesterday so that we can keep our family-friendly filters to protect children. I can tell the House that we will legislate to put our agreement with internet companies on this issue into the law of the land so that our children will be protected”.—[Official Report, Commons, 28/10/15; col. 344.]
In responding to my Question for Short Debate on adult content filters this year, the Minister explained that the Government have now received new legal advice that suggests that it is not actually necessary to change the law—but, to put the matter beyond doubt, she confirmed that an amendment will be made to the Bill before us today. The noble Baroness, Lady Shields, said:
“We have examined the regulation in detail, and the potential for the network-level parental filters currently offered by providers to conflict with it. We have now received clear legal advice that such network filters that can be turned off are compliant with the regulation. Article 3.1 of the regulation states: ‘End-users shall have the right to access and distribute information and content ... of their choice’. Filters that can be turned off are a matter of consumer choice. Therefore, they are allowed under the regulation”.—[Official Report, 1/12/16; col. 411.]
I am most grateful to the Minister for this clarification—or rather, I was—but I fear that it has generated a number of follow-on questions.
First, when I read the relevant sentence from Article 3.1 in full, it seemed that the reference to “choice” upon which the Government are depending relates to “terminal equipment” rather than to “content”. The full sentence says:
“End-users shall have the right to access and distribute information and content, use and provide applications and services, and use terminal equipment of their choice”.
In other words, an ISP cannot say that you must use a particular type of router to access the service that it supplies to you. Can the Minister explain why the Government believe that “choice” in Article 3.1 refers to choice about network-level filters? If I am right, and the reference to choice pertains to terminal equipment rather than network-level filters, Article 3.1 would not lift the central obligation in the regulations, described by the European Commission in these terms:
“Every European must be able to have access to the open internet and all content and service providers must be able to provide their services via a high-quality open internet. Under these rules, blocking, throttling and discrimination of internet traffic by Internet Service Providers … is not allowed in the EU, save for three exhaustive exceptions (compliance with legal obligations; integrity of the network; congestion management in exceptional and temporary situations) and users are free to use their favourite apps and services no matter the offer they subscribe to”.
Given that the “integrity of the network” and “congestion management” exceptions do not apply in the case of filtering, this leaves us with “compliance with legal obligations”. In other words, in order to be justified, filtering would need to be the result of a “legal obligation”. However, the way in which the noble Baroness, Lady Shields, described the legislation made it sound as if the proposed amendment would merely clarify that it was legal to provide unavoidable choice or default-on adult content filters if an ISP wished so to do. Specifically, she said:
“We will bring forward an amendment to the Bill in the Lords, to the effect that providers ‘may offer’ filters”.—[Official Report, 1/12/16; col. 411.]
There was no suggestion of an obligation on ISPs to provide this service. I would be grateful if the Minister could set out what the requirements will be on ISPs and, if they are less than mandatory, how they can hope to comply with the net neutrality regulations.
Secondly, I will ask the Minister about the impact of the net neutrality legislation specifically on the use of adult content filters in relation to public wi-fi in the UK. Last year, the Minister set out the Government’s achievements in child protection on the internet. She said:
“The major public wi-fi providers have made family-friendly wi-fi available wherever children are likely to be accessing the internet unsupervised. These are significant achievements”.—[Official Report, 17/7/15; cols. 859-60.]
I strongly agree, but if the criterion for being net neutral is indeed that the user can turn off the filters, I ask myself: who should be able to turn the filters off with respect to public wi-fi for their use in this context to remain legal? I hope the Minister’s legal advice covered that point and that she will be able to reassure the House that neither net neutrality requirements nor the Government’s amendment will place family-friendly wi-fi in jeopardy.
The third question that arises from my consideration of the net neutrality regulations relates to mobile phones. The definition used in the regulation, where an,
“‘internet access service’ means a publicly available electronic communications service that provides access to the internet, and thereby connectivity to virtually all end points of the internet, irrespective of the network technology and terminal equipment used”,
suggests that net neutrality applies to ISP filtering and to filtering by mobile phone operators (MPOs). Will the Government’s amendment consequently apply to mobile phone operators as well as internet service providers to protect filtering on mobiles?
Finally, as so many of these questions rely on legal advice, would the Minister be willing to place a copy of that advice in the Library so that your Lordships can have a full understanding of the expectations that need to be met through the net neutrality requirements and of where there is latitude for child protection measures?
I do not expect that I will be given all the answers tonight, but no doubt there will be other occasions for them during the Bill’s passage through your Lordships’ House. In any event, I very much look forward to the Minister’s response and to discussing the amendments in Committee.
Baroness Howe of Idlicote (Crossbench - Life peer)
Lords Chamber
My Lords, as I said at Second Reading, I am pleased to be speaking today on a subject that I have regularly brought before your Lordships’ House over recent years in my several online safety Bills—the importance of protecting children online, which is very much today’s subject. I have tabled my probing Amendment 55 so that the Government can set out their plans for which organisations will act as the age verification regulator for which sections of this Bill, as this is crucial to ensure the child protection provisions of Part 3 are successfully implemented.
My amendment would designate the British Board of Film Classification as the age verification regulator for the whole of this process. I know that many in this House and the other place were delighted to hear the Government’s announcement that the BBFC will be the notification regulator for Part 3, a position for which it will be formally designated later this year. I am sure we can all agree that it will bring a level of expertise to that role which will be really invaluable. The use of the term “notification” regulator, however, suggests that the BBFC will provide only part of the regulatory function and that another kind of regulator will have a role to play. Indeed, this was backed up by the BBFC, which stated in its evidence to the Public Bill Committee in another place that it did not intend to have any role in enforcement under Clauses 21 and 22 on fines and informing payment providers and ancillary service providers.
This same message is repeated in the Explanatory Notes:
“The BBFC is expected to be the regulator for the majority of the functions of the regulator (including issuing notices to ISPs to prevent access to material), but is not intended to take on the role of issuing financial penalties and enforcement notices to non-compliant websites”.
This raises an important question to which the Minister must now provide an answer. Who will be the other regulator? It is one thing not to have clarified this at the point of introducing the Bill. It is, however, quite another for the Bill to have passed entirely through one House and be well on its way through another without any update.
In asking this question, I should say that I was very pleased to see that the Government have said that the BBFC will assume the enforcement role in relation to Clause 23, which was introduced on Report in another place. This, however, still leaves questions about the enforcement regulator for Clauses 21 and 22 and how the enforcement regulator in these clauses will interact with the BBFC in its role as “notification” regulator.
In its report on the Bill, the Delegated Powers and Regulatory Reform Committee criticised the lack of information about the regulator, saying:
“The decision as to who to appoint as regulator should be taken before, not after, a Bill is introduced so that it can be fully scrutinised by Parliament. This is especially because the regulator will have important and significant powers conferred by Part 3 which include the ability to impose substantial civil penalties”.
Parliament should know who the enforcement regulator will be, since it will be able to impose these substantial penalties.
Your Lordships’ House should be informed how the enforcement regulator, assuming the Government are still planning on a second regulator, will operate with the BBFC in terms of the mechanics of deciding whether to issue a fine, an enforcement notice, or notice to internet service providers to block certain sites, and how the two regulators will produce consistent guidance for this part of the Bill. Ofcom would be an obvious option for enforcement, but last November it made it clear that it does not want the role. I ask the Minister: who do the Government have in mind? When will he bring that information to the House? I look forward to hearing what he has to say.
My Lords, I apologise to the Committee for not taking part in Second Reading. Having led on the Investigatory Powers Bill and the Policing and Crime Bill I was hoping for some time off for good behaviour, but apparently a policeman’s lot is not a happy one, even when he has retired.
My noble friend Lord Clement-Jones and I have Amendment 55B in this group. The first thing to say is that we on these Benches believe everything that can be demonstrated to be effective should be done to restrict children’s access to adult material online. We also believe that everything should be done to ensure that adults can access websites that contain material that it is legal for them to view. That is why Amendment 55B would require the age-verification regulator to produce an annual report on how effective the measures in the Bill have been in general in reducing the number of children accessing adult material online and how effective each enforcement mechanism has been. We also share the concerns expressed by the noble Baronesses, Lady Jones of Whitchurch and Lady Howe of Idlicote, that these provisions have been introduced somewhat at the last minute and may not have been completely thought through.
The aims of the Bill and the other amendments in the group are laudable. The ideal that children should have the same protection online as they have offline is a good one, but it is almost impossible to achieve through enforcement alone. We have to be realistic about how relatively easy it is to prevent children accessing physical material sold in physical premises and how relatively difficult, if not impossible, it is to prevent determined children accessing online material on the internet, much of which is free. An increasing proportion of adult material is not commercially produced.
That is not to say that we should not do all we can to prevent underage access to adult material, but we must not mislead by suggesting that doing all we can to prevent access is both necessary and sufficient to prevent children accessing adult material online, the detail of which I will come to in subsequent amendments. Of course internet service providers and ancillary service providers should do all they can to protect children, but there are also issues around freedom of expression that need to be taken into account.
My Lords, first, I thank the Minister for his opening remarks at the beginning of this debate. I was pleased to hear that the Government are in listening mode as we work our way carefully through this Bill.
When we speak about the crucial subject of the enforcement of the age verification provision, it is vital to remember that we are talking about how we ensure that children and young people are kept safe. All the evidence is that early exposure to pornographic material can be extremely harmful to children. The Economist reported that given the view that sexual tastes are formed around puberty,
“ill-timed exposure to unpleasant or bizarre material could cause a lifelong problem”.
As I repeatedly say, childhood lasts a lifetime.
There is evidence that pornography can lead to unrealistic attitudes to sex, have damaging impacts on young people’s views of sex and relationships, put pressure on them over how they look and influence them to act in a certain way. All of that reminds us of the context in which we are having these discussions on the finer points of enforcement. With this context in mind, we need to make sure that the age verification provisions in Part 3 are backed up with the most effective means of enforcement.
We have heard noble Lords set out why they think Amendment 66 would be better than Clause 23, but does it really stand scrutiny? There is a concern about the delay that would result from Amendment 66. Quite apart from the fact that requiring the age verification regulator to enforce the age verification requirement through court injunctions would be much slower and much more expensive than the procedure under Clause 23, there is the fact that Amendment 66 would further delay the provision of effective enforcement, and therefore child protection, through the requirement that IP blocking would take effect only if the Secretary of State at some future point decides to make regulations allowing this. In this regard I am particularly concerned that the drafting of proposed new subsection (1) in Amendment 66 implies that the Secretary of State can consider making regulations only when the BBFC considers that there is an actual person in contravention. The BBFC cannot be ahead of the game and will be on the back foot while it waits for the regulations to be made, if they are to be made at all. This does not make our children and young people safer. I am also concerned that Amendment 66 does not provide legal clarity for ISPs at this stage of the Bill on whether IP blocking will be required and, if so, how that will need to be delivered.
While Amendment 66 does not provide certainty, Clause 23 sets out very clearly its central requirement in subsection (2)(c) that an ISP must,
“prevent persons in the United Kingdom from being able to access the offending material using the service it provides”.
It sets out when that would be required in subsection (1), how it would be implemented in subsection (2) and the obligations on the ISPs in subsection (8). The BBFC knows what it can do; the ISPs know what will be expected of them; and the pornographic websites will be clear that their sites might be blocked if they do not comply with Part 3. In comparison with the much weaker Amendment 66, Clause 23 is so effective that exchanging them would fundamentally weaken the child safety provisions in the Bill. That would be a real tragedy.
Why are we making exceptions for porn merchants? We have had a system in place in the UK for dealing with child abuse images for over 20 years. It is the envy of the world. It has never required prior judicial authorisation. Let us be clear: the Internet Watch Foundation, which runs the system, could at any time be brought to court to explain an action or decision it has taken, because it is subject to judicial review. Not only has the IWF never lost a judicial review case, no one has ever taken one against it. We get rid of terrorist material without requiring any judges or courts to get involved and I have never heard any criticism of that system. But if we are talking about protecting children against porn—oh no. Everything slows down, everything becomes more expensive and we have to get a judge and lawyers involved, because it is suggested that, uniquely, we need prior judicial authorisation.
However, the age verification regulator will have an appeals system. Every decision the regulator takes can be made the subject of a judicial review. If the regulator gets taken to court and loses all the time, perhaps we would need to look at the provisions again, but I have absolutely no reason to believe that would be the case. Therefore I think that Amendment 66 should be rejected, because the material we are talking about here is extremely harmful to children and we want it out of sight as quickly and simply as possible. I am sure that no one in this House would want their children or grandchildren ever to be exposed to or damaged by this vile material. Our overworked courts and judges have enough on their plates. We simply do not need to drag them into this on a routine basis. Let us put our children’s well-being and protection first. I very much hope that the Government will stand by Clause 23 and reject Amendment 66.
My Lords, like the noble Baroness, Lady Benjamin, I rise to speak against Amendment 66, which in my judgment would seriously undermine the scope for Part 3 of the Bill to be enforced. I have campaigned for child safety online for many years and am far from reassured that the amendment will deliver on that objective. I have also repeatedly raised concerns about the quantity and type of pornography accessed in the UK from websites based in other jurisdictions. I am very pleased that the Government have recognised that this is a significant issue. However, without being able to ensure that foreign websites take the action that is required under Part 3, in practical terms we will be no further forward.
This is no theoretical discussion. In its evidence to the Public Bill Committee in the other place, the British Board of Film Classification said that it planned to target regulation at about 50 sites and that it does not expect any of these to be in the UK. Clause 21 sets out fines but is far from clear about what the Government can do if a site in another jurisdiction refuses to pay a fine; your Lordships can come back to that when we debate the next group of amendments.
Clause 22 has a better international reach but it fails in a number of different scenarios relevant to the discussion on Clause 23: first, if a site offers free pornography; secondly, if it does not use conventional credit cards but relies on payment methods such as bitcoin; and thirdly, if the website does not use a UK-based ancillary service provider. These very brief statements highlight the need for another enforcement option for foreign websites, and I am pleased that many Members in the other place agreed. I commend the work of Mrs Claire Perry, the honourable Member for Devizes, who had the support of 34 MPs from seven parties for her amendment, which had a similar objective to Clause 23. I also congratulate the Government on responding constructively with the introduction of Clause 23.
For Part 3 to be effectively enforced, it is critical that foreign sites know that the UK regulator could block them. The Digital Policy Alliance, in its briefings on the Bill, said that there would be a major loophole in the Bill without an IP-blocking option. To this end, the proposals in Amendment 66 are deeply problematic. My noble friend Lord Morrow has already mentioned concerns about delays arising from the need for the Secretary of State to produce regulations and the question of whether he or she will use the power. On top of this, court injunctions are expensive and cumbersome, and every website would know that they could be used only very occasionally, which could tempt foreign sites serving the UK to risk not bothering with age verification.
I am also concerned that Amendment 66 would undermine the admirable work the Internet Watch Foundation does on removing child abuse images. I understand that if blocking of pornographic and prohibited material should require court injunctions, it will set a very difficult precedent for bodies such as the foundation, which help to keep our children safe. If it had to use a court injunction every time it requested that a page should be taken down, that would greatly limit and inhibit its capacity and as such would be a grave and very serious mistake.
By contrast, Clause 23(1) allows the BBFC to use IP blocking, after notifying the Secretary of State, from the day Part 3 comes into effect. The BBFC may need to use this power early in the Bill’s implementation if it cannot trace a foreign website or if the website is unresponsive and does not use credit card payments, which might be blocked under Clause 22. There will be no delay as to when this enforcement power can be used. Secondly, it will give the BBFC the power to ask ISPs to block sites when they need to. It is not saying that they must use this power but that they can. There will be none of the delay or expense of going to court to get a blocking injunction. Thirdly, there will be no negative impacts on the Internet Watch Foundation and the admirable work it does on removing child abuse images.
In a context where the majority of online pornography accessed in the UK comes from websites based in other jurisdictions, the provision of a robust and flexible IP blocking mechanism is central to the ability of this legislation to enforce the age verification provisions that are at its heart to keep our children safe. To swap Clause 23 for Amendment 66 would not reflect well on us. In closing, I warmly congratulate the Government on Clause 23 and hope that they and the Minister will stand resolutely by it and against Amendment 66.
Baroness Howe of Idlicote (Crossbench - Life peer)
Lords Chamber
My Lords, I have Amendment 69A in this group. Before I discuss that I wish to address a few remarks to the other amendments in the group. I understand the concerns of the noble Lord, Lord Morrow, about enforcing fines on people who are not within the United Kingdom. However, I do not understand how his Amendment 58 would be any more effective if the payment service provider or the ancillary service provider is also outside the UK. Perhaps when he addresses the Committee shortly, he will also indicate to me, because I am a little confused, the difference between his provision in paragraph (a) of proposed new subsection (2) in his Amendment 65, where enforcement of the age verification regulator’s decision on the payment service provider or ancillary service provider is implemented by way of an injunction, and the proposals suggested for a similar process under Amendment 66.
On Amendment 69A, as I mentioned on an earlier group, there are increasing amounts of adult material available on the internet that is not commercial in any sense. Much of it is taken from commercial websites but there is no reference to which website the material has come from, and therefore no suggestion that it is intended as a lure or as providing a link to a commercial site.
To take up issues just raised by my noble friend Lady Benjamin, increasingly there is pornographic material that might be described as “home videos”, either those produced by what might be described as exhibitionists or others where innocent members of the public, including some celebrities in recent years, are deceived into performing sexual acts to their computer camera not knowing that they are being recorded for subsequent posting on to publicly available websites. There is also the issue that Liberal Democrats have been very strong in trying to tackle: those instances of “revenge porn” where disgruntled exes post compromising videos online. From what I can see, that type of material is not covered by the Bill, as there is no commercial aspect and no ancillary services involved. There is confusion about what “ancillary service providers” means. In his remarks on an earlier group of amendments, the Minister talked about pornographers to whom ancillary service providers provide their services. In the case of self-generated or home-grown obscene material, though, there is no pornographer that the website is providing a service to, at least in one sense. Perhaps the Minister will clarify that.
The noble Baroness, Lady Kidron, spoke about the fact that there are some social platforms, such as Facebook and Instagram, which are very good at taking down inappropriate material: they have strict rules about obscene material posted on their platforms. However, there are particular difficulties here with platforms such as Twitter and Tumblr. Although 99% of the content is innocent and of no harm to children, or anyone else, there are Twitter feeds and Tumblr pages that have adult material on them. Those are not simply links to porn sites, but actual videos on the pages or Twitter feeds themselves. While most have a warning on the front page—NSFW (not safe for work) or 18+ only—that warning usually appears on the very page that already contains pornographic images. Even on Twitter, it may not be clear that the media content is pornographic until one has accessed those images. Clearly, there is difficulty in enforcing age verification on those platforms when the overwhelming majority of the material contained on them is not adult material.
What I believe needs to be explored is making a tool available to those who want to use social media for adult material, so that when the Tumblr page or Twitter feed is accessed, the user is diverted to a page that warns what lies behind and provides an option to divert away from the adult material. That alternative page could be a government-specified warning about the impact that pornography can have on young people, advising where support can be given and so on: the equivalent to the warning messages that are now printed on cigarette packets, for example. Alternatively, the Government could by regulation insist that such a tool was made available to ensure such a warning page is placed on accounts, as the noble Baroness, Lady Benjamin, mentioned just now, so that people are alerted that such pages or Twitter feeds have adult content on them. It falls short of requiring age verification or blocking such accounts, which I am sure Twitter and Tumblr would resist, but it would still address an important issue.
In its useful briefings on this aspect of the Bill, the NSPCC says there is a particular problem with children who accidentally stumble across adult material. This would go some way to addressing that issue. The NSPCC says a particular problem is pop-up advertisements from commercial pornography sites, which regrettably this amendment does not address—nor is that addressed by any other part of the Bill. Will the Minister tell the Committee whether there is any move by the Government to address that issue?
It is one thing for the BBFC to block a porn site that does not have age verification; it is quite another to suggest—as the Minister said on an earlier group of amendments—that we block a platform such as Twitter, if it fails to do the same for a handful of feeds that contain adult material. I accept that the amendment as drafted is probably far too wide in the powers it gives to the Secretary of State, but it is important that we do not ignore non-commercial adult material, which is increasingly a problem on the internet.
My Lords, my amendment to Clause 17, which noble Lords have already discussed, raised the importance of knowing how the Government plan to enforce the Bill through the appointment of one or more age verification regulators. The amendments tabled by the noble Lord, Lord Morrow, and the noble Baroness, Lady Benjamin, raise similar questions about the mechanics and processes of enforcement and I am very glad to be able to speak in support of Amendments 63, 56, 58 and 65.
On Amendment 63, I agree completely with everything that the noble Baroness, Lady Benjamin, has said. If we are not to have real clarity about the identity of ancillary service providers in the Bill, the idea that we can make do with optional guidance is unsustainable. It must be made mandatory. On Amendment 56, I support the call from the noble Lord, Lord Morrow, to hear a full explanation from the Minister of the mechanisms for enforcing the fining provisions in Clause 22 in other jurisdictions, which were alluded to by the Minister in another place.
In the time available today, however, I would like to focus particularly on Amendments 58 and 65. Any noble Lords who were in your Lordships’ House when we debated the Gambling (Licensing and Advertising) Act 2014 will know that I had a major reservation about the Government’s plans to rely on payment providers to enforce the licensing provisions applying to foreign websites. I think that the noble Lord, Lord Morrow, has demonstrated that my reservations were well founded. In response to written Parliamentary Questions I tabled last year, the Government said that, since the law came into effect in 2014, the Gambling Commission has written to approximately 60 gambling websites reminding them of the law, and payment providers have been asked to block payments 11 times. Given the size of the global online gambling market that is accessible from the UK, that number surely seems tiny. If we are supposed to be reassured, I suggest that the Government should think again.
The noble Lord, Lord Morrow, also raised questions about why the Government think that ancillary service providers will act to withdraw their services. I recognise that the Government want to disrupt the business models of pornographic websites, but for some companies, to withdraw their services would be disrupting their own business models. They may be small businesses, not major international organisations such as Visa and Mastercard. In such cases, it would not be in the interests of the business to act. They cannot be expected to do so unless it is made an explicit legal requirement with a clear sanction. My concerns about the absence of any sanction or requirement to act are readily acknowledged by the Government’s own publications, in a manner that I find rather unnerving. In the press release the Government issued when they announced their plans for IP blocking, they said they were,
“also seeking co-operation from other supporting services like servers to crack down on wrongdoers”,
and in the notes to the release said:
“Websites need servers to host them, advertisers to support them, and infrastructure to connect them. With the international and unregulated manner in which the Internet operates we cannot compel supporting services to be denied but the regulator will seek to gain cooperation from the industry”.
They seem to be hoping that, although they have inserted this age verification requirement into statute, it is acceptable to back it up with what is effectively a non-statutory, half-hearted goodwill enforcement mechanism. Lest anyone doubts this, they should review the Government’s evidence to the Delegated Powers and Regulatory Reform Committee about the delegated powers in the Bill. The Government reported on the guidance to be issued under Clause 22(7) about who will be given a notice about non-compliance of pornographic websites. Importantly, the Government said:
“The recipients of those notices can decide whether or not to take action. Accordingly it is considered that no Parliamentary procedure is necessary”.
It seems that the Government hope that by placing the obligation for age verification in statute, we will congratulate them on fulfilling their election manifesto commitment, without—at least as far as Clause 22 is concerned—any credible commitment to enforcement.
My Lords, first I thank the noble Lord, Lord Browne, for supporting my amendment in the last group about proportionality and the order in which websites should be tackled. Moving on to this group, I spoke to this set of amendments when we addressed this issue in the group starting with Amendment 54B—so I can abbreviate my speech and be quick. I support the noble Lord, Lord Browne, on the point made in the part of the briefing he was reading about the Obscene Publications Act and the Crown Prosecution Service advice et cetera being out of step with each other and out of step with enough members of the public for it to matter—that is the real trouble. I had thought to mention one or two of the unsavoury practices that you might find that will not be classified under the current ruling in Clause 23, but I think I have been trumped by the newspapers.
Some in the BBFC probably see this as an opportunity to clean up the internet. But it is not going to work; it will have the reverse effect. This whole issue of what is prohibited material needs to be tackled in another Bill, with a different regulator or enforcer, so it does not get confused with the business of protecting children, which is the purpose of this Bill. It will not protect children anyway, as this material ought to be behind the age verification firewall in any event. In fact, the noble Lord, Lord Browne, pointed out why it might not be: you have a possible lacuna in the Bill. If you say that the material is stuff that the BBFC has classified, the really nasty stuff is not included, because it is not able to be classified—so suddenly Clause 23 might not apply to it. He is absolutely right there. This is one of the dangers, which is why they are having to try and draw in the idea of prohibited material. It would be much easier to remove prohibited material altogether.
It has been suggested to me that the easiest thing would be to alter Clause 16, which deals with the definition of pornography. Instead of having this very limited scope, it would be much easier just to have the one simple definition which is already in Clause 16(1)(e)(i), but with the wording slightly expanded to say, “Without prejudice to the application of the Obscene Publications Act 1959, any material produced solely or principally for the purposes of sexual arousal”. You could leave it at that, and then you would protect children from anything unsavoury that we do not want them to see. That is a much simpler solution than getting into this terribly complicated debate about what is prohibited material.
My Lords, I very much share the concerns expressed by the noble Lord, Lord Browne, about this set of amendments and prohibited material. As they stand, the amendments would have the effect of limiting the age verification checks in Clause 16 to 18 and R18 material, while prohibited material would be freely available without any such protection. This would be pretty irresponsible and would show no regard for child protection. Even if the Bill were amended so that prohibited material was legal online only if placed behind age verification checks, we should not forget that the important strategy of targeting the biggest 50 pornography sites will not create a world in which children are free from accessing prohibited material, so that adults can relax and access it without concern. Even if the material were made legal online and given a BBFC classification, this would give it a measure of respectability, in the context of which it would no doubt become more widely available, and the chances of children seeing it would thus be further increased.
Moreover, the crucial point is that we cannot make prohibited material legal in an online environment at the same time as maintaining the category of prohibited material offline. The former would inevitably result in the latter. Mindful of this, and of the fact that the category of prohibited material is long established, it would be wholly inappropriate for the House or indeed the Government to simply end the category of prohibited material online without a major public consultation. I very much hope that the Minister will completely reject these amendments and stand by what he said on this matter at Second Reading.
Baroness Howe of Idlicote (Crossbench - Life peer)
Lords Chamber
My Lords, it seems odd in a society such as ours that we are even thinking about how to give access to violent pornography or trying to mitigate it in some way. It seems clear to me that most of us sitting in this House probably have less idea of how online digital communications work than a five year-old. Children—my grandchildren’s generation—are very adept and almost intuit how to do this stuff. The technology is advancing so quickly—more quickly than we can imagine—and you can bet your life that many of our children will find ways around it more quickly than we can set down laws. What is online ought to be held at least to the standard of what is appropriate for offline, because it is online that children, as well as young people and adults, will access this stuff, and it is too easy. If the higher standard applies to offline, surely it ought to be maintained for online communications. Otherwise, we are saying that this is acceptable for the common good and that it represents an acceptable anthropology—our understanding of what a human being is—in which we are happy to normalise violence, the commodification of people and sex, and even the exploitation, not just for sexual purposes but for commercial profit, of something that ought to be held in higher regard.
My Lords, I spoke on the subject of prohibited material in Committee and I rise to do so again. In Committee, I raised concerns that if the Digital Economy Bill was amended so that prohibited material could be supplied if placed behind age-verification checks, children were still likely to see this material because the Government have made clear that they are expecting a “proportionate enforcement” targeting the biggest pornography sites—likely to be the top 50 to start with—so we are not creating a world in which children are safe from accessing prohibited material. They will be safer, yes, but not completely safe.
That is the sad effect of government amendments to Clause 16. If they are accepted, it will become acceptable for a website to supply any material so long as it is behind age verification, unless it falls within the very narrow definition of extreme pornography. By doing so, we are giving violent and abusive material a large boost of respectability, as we do not allow supply of the same material via DVDs or UK-based video on demand.
In this context, the fact that the legislation defining prohibited material remains in place does not make these amendments more acceptable. It simply presents a very awkward question for the Government. Why do they not want to enforce the standards set by these laws? The decision to go to the lengths of asking us to change the Bill so that most of the laws that make up prohibited material will not be enforced cannot but send the message that in some ways we regard this as acceptable. How does changing the Bill today to allow pornographic violence that depicts injury to the breasts, anus and genitals so long as it is not serious, and serious injury to any other body part, do anything other than normalise violence against women? How is this consistent with the Government’s other messaging on violence against women?
The other government argument—that the CPS will still retain the discretion to prosecute—borders on the absurd. As everyone knows, the vast majority of online porn accessed in the UK comes from websites based in other jurisdictions that cannot be easily reached by our courts. That is the whole point of creating an age verification regulator with the enforcement powers in Clauses 22 and 23, which do not depend on getting errant websites in Russia into court. I am especially concerned that this material will include some images of children. The origins of this part of the Bill were, after all, to protect children. I know that the Internet Watch Foundation has a very effective role in working with internet service providers on photographs and pseudo-photographs of children. However, I am troubled because there is no agreement around the world about the ethics of animated pornographic images of children. The IWF’s role on animated images is restricted to images hosted in the UK.
My Lords, I support Amendment 25YR and will speak to Amendment 33A, which is in my name. We certainly need to look much more closely at the duty of online providers and their responsibilities. Amendment 25YR refers to the overarching duty of care that agencies must have to children. Both amendments address the need to oblige these online providers to report content that is likely to contravene existing regulations and likely to meet the criminal test used in prosecutions.
The obligations also include that the content should be removed with immediate effect. As we have already heard, this has proved difficult in many cases and very many people say that they have tried to have offensive material removed unsuccessfully. Amendment 25YR refers to a code of practice, and mentions that it needs specific terms that prohibit cyberbullying and provide a mechanism for complaints, as well as for the removal of the offending material. The other thing I particularly welcome about this amendment is the obligation to work with educators, technical professionals and parents to ensure that young people have safe use of the internet.
Amendment 33A would extend this principle rather wider. I am sure we all support measures to prevent cyberbullying of children. It is also fair to say that it is not just children who suffer in this way. Many members of minority groups, disabled people and people with learning difficulties—in fact, people who are in some way different—come in for regular forms of abuse. People just like you and me, having disagreed with somebody, come in for torrents of vile, unpleasant and absolutely unacceptable bullying on the internet. I believe that this would not be allowed in newspapers. Somebody would not be allowed to abuse someone else in a pub. The landlord would be responsible and I believe that it is time we took the online providers to task and made them take some responsibility for what appears and what they allow.
The Minister, in replying to my amendment last time, mentioned that existing legislation already provides the means to do this. In fact, I think over 30 statutes refer to these measures and have not yet been consolidated—added to which, there are laws coming into force that will make it even more difficult to have a consolidated approach, such as the revenge porn legislation and the law on streaming of child abuse. It is becoming increasingly complex and we need a much firmer approach.
It was also mentioned that the Home Office had £4.5 million to address this issue; I understand that this was largely for the measures and resources that the police needed to prosecute criminal acts in this way. The last thing the Minister referred to was that the Law Commission was consulting on this issue. My understanding of that consultation is that it is about improving people’s behaviour on the internet. It does not at all address the online providers. This Bill offers an opportunity to address an appalling practice that is becoming even more prevalent, and I hope that the Minister will agree to incorporate these amendments in the Bill.
My Lords, I am very happy to support the amendment—to which I have added my name—which would bring in a statutory code of practice for media platforms with the important aim of preventing online abuse.
As I said earlier, Part 3 is a child protection measure. Young people use social media. The 2016 Ofcom children and media report devoted an entire chapter to YouTube, social media and online gaming. Around 72% of 12 to 15 year-olds have a social media profile, with Facebook being their main social media profile, and three in 10 of these 12 to 15 year-olds visit their social media account more than 10 times a day. In the last few weeks we have heard about Facebook not taking down child sexual abuse images. Last week, the Home Affairs Select Committee in the other place grilled representatives of Google, Facebook and Twitter on their response to online abuse and hate crime as part of their inquiry into hate crime and its violent consequences.
This amendment is in line with the Government’s objectives to keep children safe. I am expecting the Government to come back and tell us that the UK Council for Child Internet Safety produced guidance for social media sites in 2015, entitled Child Safety Online: A Practical Guide for Providers of Social Media and Interactive Services, and that therefore this code of practice is just not needed. While I commend the good work of UKCCIS, the news of the last few weeks leaves me convinced that without a statutory code we are not doing enough to protect children and to support parents. Parents have to navigate completely new technological terrains. They have no reassurance that there are consistent standards across social media sites, nor what they are. Last year a third of parents said they were concerned about their child being the subject of cyberbullying. Part of the requirements of the code would ensure that social media sites worked with,
“education professionals, parents and charities to give young people the skills to use social media safely”.
I fully support this initiative. Ofcom reports that 52% of parents of eight to 11 year-olds and 66% of parents of 12 to 15 year-olds talked to their children about cyberbullying. This is encouraging, but how much more encouraging if parents know that if they talk to their child about Facebook, the same rules apply on other social media sites and vice versa.
We expect to make our children safe in the physical spaces they occupy every day and have no hesitation in using the law to do so. We need to be doing the same online so I fully support Amendment 25YR to introduce a statutory code of practice for social media platforms.
My Lords, I support the amendment proposed by the noble Baronesses, Lady Jones and Lady Janke, but also the remarks of my noble friend Lady Howe. I want to ask the Minister, when he comes to reply, about an issue that I raised in your Lordships’ House previously, and that is the issue of suicide sites on the internet. It concerns me that young people can be encouraged to visit those sites and take their own lives. Only a year ago I attended a school prize giving in a north-west school, and the headmaster told me when I arrived how a child in that school had taken their own life only the day before. As noble Lords can imagine, that was a terrible tragedy not only for the family but for the whole school, and it rather changed the atmosphere on that occasion. That child had been visiting one of the suicide sites on the internet, and the headmaster discovered that several other children had been doing the same.
It can be revenge porn or the kind of trolling to which the noble Baroness referred, the harassment of young women in particular, or the whipping up of xenophobia, racism or anti-Semitism, but it is right that there should be a code of practice, and we should get on with it. I hope that the Minister will tell us more about the Green Paper, what the framework will be for it and when we are going to start to look at these issues seriously.
Baroness Howe of Idlicote (Crossbench - Life peer)
Lords Chamber
My Lords, briefly, I very much support this amendment and above all salute the work of the noble Baroness, Lady Benjamin, for all she has done over many years in making the case for the production of more and very much better-quality television programmes for children, whether by the BBC or other programme-makers. It is very good to see the name of the Minister on this amendment and I hope I am not wrong that as a result the Government fully support it. I hope we shall hear that soon.
My Lords, I congratulate the noble Baroness, Lady Benjamin, on her continuous hard work on this issue. We also added our name to the amendment in Committee and again today. I very much share in her delight and happiness that progress has finally been made. As the noble Baroness said, this is effectively an enabling amendment for Ofcom. I hope that it will not just sit on the statute book; we look now for action to follow it through. As the noble Baroness said, there is already sufficient evidence, which Ofcom has, of the huge decline and reduction in children’s TV. There is no need for a pause while Ofcom finds evidence as to whether it needs to act. The evidence is already there. I hope that when Ofcom comes to consider the new powers we are providing, it will feel able to act straightaway. I hope that the Minister can reassure us that she will encourage Ofcom to do just that, and that this will not just sit there as an enabling power but is something the Government will encourage Ofcom to act upon. Again, I look forward to the Minister’s response.
Baroness Howe of Idlicote (Crossbench - Life peer)
Lords Chamber
My Lords, I rise to speak to my Amendment 33ZLA on adult content filters. After all the lengthy discussions about age verification, some might be tempted to think that filters have been overtaken and eclipsed by age verification checks. However, that is not the case. The age verification checks in Part 3 relate narrowly to pornography and not to other non-pornographic adult content. This leaves Part 3 offering no protection in relation to violence, self-harm, gambling and so on. In another place there was a debate about extending age verification checks to other forms of adult content and this is something that I think is worthy of further consideration, perhaps in the forthcoming Green Paper on internet safety.
In the short term, however, it seems to me that we should make better use of adult content filters. The Government have asked Ofcom to produce a series of reports on the filtering provisions and practices of the four largest ISPs. These reports have helpfully provided objective analysis of the way each of the four ISPs has approached adult content filters, the standards to which they have subscribed and the extent to which customers have used them. This information has been very useful for policymakers and parents. If we concede that it is important to understand what ISPs are doing in relation to adult content filters, however, it simply makes no sense to look only at the conduct of some ISPs. Indeed, if Ofcom was only going to look at the conduct of some ISPs, it would make more sense for it to shine the spotlight on the conduct of the smaller ISPs as they are not party to the family-friendly filtering agreement between the big four ISPs.
There is no public clarity about the conduct of smaller ISPs in terms of whether or not they provide adult content filtering options, how they provide these options or what filtering standards they apply. Far from making for transparency, this generates confusion for both parents and policymakers. My amendment would end this very unsatisfactory state of affairs and require Ofcom to assess the conduct of all ISPs in relation to adult content filters.
In making this argument, I am mindful that some have suggested that the smaller ISPs primarily serve businesses rather than homes, which might lead some to conclude that it is not relevant to assess their conduct in relation to adult content filters. In the first instance, even if it were true that the smaller ISPs primarily serve businesses, to the extent that they also serve homes there would be a clear need to assess their conduct in relation to adult content filters. After all, every child matters.
Secondly, and more importantly, while I certainly acknowledge that some small ISPs such as Claranet focus only on business customers, that is not the case for others such as KCOM, the Post Office and Plusnet. There is a sense in which the differing assessments as to whether the smaller ISPs serve businesses or homes highlight all too well the lack of clarity about the smaller ISPs, demonstrating the need to ask Ofcom to review their conduct in relation to adult content filters, as well as that of TalkTalk, Sky, Virgin and BT. I believe in transparency, and that we particularly need greater transparency in relation to the conduct of the smaller ISPs. This will serve two important ends. In the first instance, it will help inform a clearer public policy debate about child safety online and the role of filters, which I believe would greatly assist the Green Paper process. In the second instance, the data gathered could be made available to help parents who want a good objective understanding, from an official source, of the kind of filtering options that an ISP provides and of the filtering standards to which it subscribes. This would help empower parents as they seek to rise to the challenge of helping to keep their children safer in a digital age.
In closing, I thank the Minister for meeting me to discuss the conduct of the smaller ISPs and for the conversations that he had subsequently about the approach of smaller ISPs with the Internet Service Providers’ Association. I very much welcome the fact that ISPA has now agreed to introduce a new step in its members’ sign-up process, which requires members to consider whether online safety tools are suitable for their customers. This provision, together with my amendment, would certainly help to move things forward. I beg to move.
My Lords, I thank all noble Lords who have contributed to the debate. I will start by saying that the noble Baroness, Lady Howe, has been a consistently strong voice in this House in favour of protecting children online and we pay tribute to that. As noble Lords know, we introduced Clause 91 in Committee on the provision of family-friendly filters, clarifying that internet service providers may restrict access to information, content, applications or services where that is in accordance with the terms of service agreed by the end user. That clause gives a reassurance to providers that such filters are compliant with EU net neutrality regulations, so the debate on that has been had in this Bill.
The noble Lord, Lord Collins, my noble friend Lord McColl and the noble Baroness, Lady Benjamin, referred to the report of the House of Lords Communications Committee, Growing up with the internet, which was published on 21 March. The noble Baroness, Lady Benjamin, hopes that we will take careful note of it. She knows that we listen to her—she had an amendment accepted. Among the many recommendations in the report, there is a call for a mandatory default on filters set to a minimum standard to be a requirement made of all ISPs and mobile network operators. Of course I can confirm that we will consider the recommendations in the report carefully as part of our developing work on the new internet safety strategy, and we will respond to it formally in due course.
However, we believe that the current voluntary approach on filters works well and that a mandatory approach would run the risk of replacing the current user-friendly parental control tools with a more inflexible top-down system. As has been noted by several noble Lords, the Internet Service Providers’ Association, the trade body for the industry, is taking further action to encourage smaller ISPs to consider online safety issues and parental control filters for their customers where appropriate. But having said that, I can make the commitment that we will listen to what the committee has said on this subject and, as I say, we will respond in due course. This amendment would require Ofcom to report to the Secretary of State every two years on the number of internet access providers which do or do not offer filters and to describe the actions being undertaken by them in relation to child protection.
As noble Lords will know, in 2013 the previous Prime Minister announced our agreement with the big four ISPs—Sky, Virgin Media, BT and TalkTalk—that they would offer network-level family filters to all customers by the end of December 2014. Ofcom was asked to produce reports on this rollout and did so in four reports issued between January 2014 and December 2015, detailing the provision of filters and child protection measures by the big four ISPs, which together account for 88% of the fixed broadband market. The conduct of the vast majority of the consumer broadband market is therefore a matter of public record. The Ofcom reports also cover data on take-up and usage by parents of these filters. The data are now updated annually in Ofcom’s Children and Parents: Media Use and Attitudes reports, which provide statistics on parental usage and awareness of filters and experience of online safety. As for ISPs other than the big four, which number in the hundreds, the vast majority are, as noble Lords may be aware, SMEs and micro-businesses offering niche, specialist and business-to-business services to small subscriber bases.
With that in mind, it is not clear from the amendment how Ofcom would gather the information it would need to prepare the statutory reports. It is likely that Ofcom would need to identify and ask providers for this information. This would be a very big task for Ofcom as ISPs enter and leave the market constantly and there is no requirement for them to register with Ofcom. It would also be disproportionate for the majority of ISPs, most of which are not focused on the mainstream consumer market, to be asked to provide this information.
The information covered by the existing Ofcom reporting ensures that the most relevant data on the actual usage of filters by parents are sourced, without disproportionate costs or impact on SMEs and micro-businesses. A statutory approach could also unnecessarily limit the scope and focus of future reporting as technology and the market change.
For those reasons, we consider it more appropriate for Ofcom’s reporting to be carried out on a non-statutory basis, allowing greater flexibility. I therefore hope that, in light of this, the noble Baroness will withdraw her amendment.
My Lords, I am most grateful to all noble Lords who have taken part in this debate and raised these extremely important issues, and to the Minister for setting out his views on what has been achieved and on what he considers to be the danger of asking Ofcom to do rather more than at present, thereby perhaps limiting some of its other work. I would certainly like to see rather more progress, but on the other hand I understand the extent to which steps have been taken. In the circumstances I will not press the amendment further, but I hope that the Minister will keep the whole issue under review and let us know as and when he becomes even more satisfied with what has been achieved, remembering that at the back of all this it is the small users, the parents and children, whom we are really concerned about protecting. Having said that, I will withdraw my amendment.
Baroness Howe of Idlicote
Main Page: Baroness Howe of Idlicote (Crossbench - Life peer)(7 years, 7 months ago)
Lords Chamber
My Lords, this is a group of technical amendments to ensure that the legislation is as clear and consistent as possible.
Amendment 2 removes Clause 10, which creates a new power for the Secretary of State to set a statement of strategic priorities relating to the management of radio spectrum. On Report, Clause 104 was introduced, expanding this power to cover telecommunications and postal services, in addition to the management of radio spectrum. The introduction of this new provision means that Clause 10 is no longer necessary. I promised on Report to introduce this amendment at Third Reading.
Amendments 3 to 8 relate to the measures for age verification for online pornography. Amendments 3 and 6 remove clarificatory wording on,
“a means of accessing the internet”,
from Clause 16 and put it in Clause 23. Due to an earlier amendment, that phrase is no longer used in Clause 16 but it is still used in Clause 23, so the definition is moved to Clause 23.
Amendment 4 is one for aficionados of parliamentary drafting. It ensures that the Bill is consistent by aligning the wording of Clause 19(7)(a), which refers to,
“the House of Commons and the House of Lords”,
with the wording of Clause 27(13)(a), which refers to “each House of Parliament”. I think we will all sleep easier at night if that is consistent.
Amendment 5 clarifies that the regulator’s power to require information extends to internet service providers and to any other person that the age-verification regulator believes to be, or to have been, involved in making pornographic material available on the internet on a commercial basis to persons in the United Kingdom.
Amendments 7 and 8 amend the definition of “video works authority” for the purposes of Clause 24, so that this includes the authority designated in respect of video games. This follows the approach to the extreme pornographic material provisions of the Criminal Justice and Immigration Act 2008.
Amendment 9 removes the provision for transitional, transitory and saving provisions in relation to the repeal of Section 73 of the Copyright, Designs and Patents Act 1988. This is a technical drafting amendment to ensure consistency between this clause and Clause 122 on commencement. I can confirm again to the House that Section 73 will be repealed without a transition period and that the Government will commence repeal without delay.
Turning to Amendment 12, I am very grateful to the noble Baroness, Lady Drake, for drawing my attention on Report to the need for complete clarity as to whom the Government are referring to in the undertaking to be transferred from BT plc to a future Openreach Ltd. I accepted that a clear definition of the term “undertaking” was necessary and offered to come back with a government amendment at Third Reading to address this issue. Government Amendment 12 does this, making it clear that we define the term “undertaking” to include anything that may be the subject of a transfer or service provision change, whether or not the Transfer of Undertakings (Protection of Employment) Regulations (TUPE) apply. The intention is that all employees currently benefiting from the Crown guarantee will continue to do so if they transfer to Openreach Ltd. The Government consulted on the wording in advance of laying this technical amendment. I am grateful to the noble Baroness for assisting us, and to both BT plc and the trustee for confirming that this definition was satisfactory.
Amendments 13 to 17 relate to the Electronic Communications Code. Under the new code, an owner or occupier whose access to their land is obstructed by electronic communications apparatus without their agreement has the right to require the removal of that apparatus. Amendments 13 and 14 make it clear that this right arises only where the apparatus itself interferes with access, as opposed, for example, to a temporary obstruction by a vehicle.
Amendments 15, 16 and 17 merely correct minor omissions and referencing errors. I beg to move.
My Lords, I welcome these tidying-up amendments. I want to take the opportunity provided by this Third Reading debate to congratulate the Government once again on taking action to protect children from pornography on the internet through age verification. I shall be watching the implementation of Part 3 of the Bill closely. I would also like to put on record my thanks to the Minister for meeting me to discuss adult content filters. I am very grateful also to noble Lords who supported my amendment at an earlier stage, highlighting the need to get a better understanding of the adult-content filtering approaches adopted by smaller ISPs that serve homes with children: the noble Lords, Lord Collins of Highbury and Lord McColl of Dulwich, and the noble Baroness, Lady Benjamin.
Turning to the future, I am very much looking forward to the discussions on the Government’s Green Paper on internet safety and to their response to the Communications Committee’s report, Growing up with the internet. Part 3 of this Bill is not the end of the story on children and internet safety.
Despite many positives, in comparing and contrasting the Bill that entered your Lordships’ House with the Bill as it now leaves, my response is one of sadness. The underlying principle of parity of content has been removed and the Bill is, in this respect, unquestionably weaker as a result.
In the first instance, the Bill entered your Lordships’ House properly applying the same adult content standard online as applies offline. It leaves your Lordships’ House saying that most material that the law does not accommodate for adults offline will be accommodated online behind age verification. Only the most violent pornography—that which is life-threatening or likely to result in severe injury to the breasts, anus or genitals—will be caught. Injury or severe injury to other parts of the body appears to be acceptable as long as it is not life-threatening. As the Bill leaves us, the message goes out loud and clear that violence against women—unless it is “grotesque”, to quote what the Minister said on Report—is, in some senses, acceptable.
In the second instance, the Bill entered your Lordships’ House properly applying the standard of zero tolerance to child sex abuse images, including non-photographic and animated child sex abuse images. Today it leaves your Lordships’ House with the relevant powers of the regulator deleted, so that it can no longer take enforcement action against animated child sex abuse images that fall under the Coroners and Justice Act 2009. As such, the Bill goes out from us today proclaiming that non-photographic images of child sex abuse, including animated images, are worthy of accommodation as long as they are behind age verification.
As agreed, Third Reading is a time for tidying up. However, Part 3 of the Bill clearly requires further amendment so that the message can go out once again—as it did in the other place—that there is no place for normalising violence against women and no place for accommodating any form of child sex abuse. I hope that the other place will now rise to that challenge.
My Lords, I do not wish to detain the House unduly on these amendments. I welcome, in particular, Amendment 9 as it is the fulfilment of a pledge made by the Minister on Report. I am delighted that Section 73 of the Copyright, Designs and Patents Act will be no more as soon as the Bill comes into effect. I am delighted that the Minister has fulfilled his undertaking.