Earl of Erroll
Main Page: Earl of Erroll (Crossbench - Excepted Hereditary)
Lords Chamber

My Lords, I am very reluctant to take part in this debate, because I was not available to speak at Second Reading, which always restrains noble Lords from speaking in Committee. However, I will make three points.
First, I confess openly that I have indulged in sexual activity—I will not say when, as that might be unfair. But I have never fired a gun or a revolver in anger, or taken part in a fight with a knife, or indeed taken part in a fight at all. Yet we are not banning scenes of violence, even on the news, which are seen by children all the time, whereas we are involved in banning scenes of sexual activity. That may be right, but we ought to be looking at other areas of life as well, because they can damage children just as much as sexual activity can.
Secondly, this law as it stands (as many noble Lords who have moved or spoken to amendments have admitted) is almost inoperable. It cannot be enforced, or can be enforced only on rare occasions. That is rather like speeding in your motor car, an analogy I have used before: most people break the speed limit because they know that they will not get caught. The problem with unenforceable or rarely enforced laws is that they bring the law into disrepute, and that is the danger of this part of the Bill as it stands. We are in danger of bringing in something that is not enforceable and, by doing so, of bringing the law itself into disrepute.
Lastly, I will give my solution to all of that. The aim of this part of the Bill is not to stop pornography sites but to stop children watching them. There is a simple answer to that—but, unfortunately, it is an answer that the Liberal party do not support and which the Tory Government got rid of when we introduced the voluntary part of it. It is an identity card. If you introduce a mechanism whereby you can get into pornography sites on any device only by using your fingerprint or via eye recognition, or whatever it might be, of course that can stop it. On my iPad I already have a device by which I can save my passwords and which will show them to me when I want to use them. But I can get into it only by using my fingerprint; I cannot do it any other way. I cannot even use my normal four-digit pass code; I can do it only with my fingerprint. Why not do that sort of thing for pornography sites as well? Only adults will be able to get into them; children will be barred by the introduction of an ID card mechanism, so that you can get into it only by that means. Unfortunately I have hospital appointments during the next sitting of Committee, but I hope that on Report I will be able to introduce amendments to that effect.
My Lords, I have one amendment in this group. I very much support Amendment 65, but there is no point adding anything to what the noble Lord, Lord Morrow, said. He covered it in great detail and for all the right reasons. I will add only for the noble Lord, Lord Paddick, that a lot of the payment service providers—this is the key to it—such as Mastercard, Visa, and so on, are international. If there is a duty on them, they are very good at trying to stick to the law. That would close quite a few holes and make life a bit difficult for sites—so as a deterrent, it would really help.
Sadly, it was this same approach of cutting off the ancillary service providers that was enough to kill off pirate radio in the 1960s, which I was very sad about. This time, however, I approve of being able to do it, because I approve of the motive behind it: trying to stop children accessing pornography.
Amendment 68B, in my name, questions what a “large number” of children is. I realise that it is obvious that you have to prioritise, because 80% of the sites are over a certain size and they will definitely come under this. They handle 80% or so of the traffic, or whatever, so I can see that you should check up on them first. But they are also the ones that will comply, because many of them are onside anyway. However, let us say that there are 10% of sites left. That is an awful lot of children, if you do the maths in your head. You knock one nought off the end of however many children there are, but you still leave an awful lot. I therefore do not understand why we are leaving in a “large number” as a constant target. There must come a point when it is worth moving on to the smaller numbers as well. I therefore do not understand the purpose of the clause. It is self-evident that they will have to prioritise. If they do not, they are idiots—and I know perfectly well that the members of the BBFC are not. Therefore I cannot understand the purpose of it.
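The arithmetic behind the point above can be made concrete. The figures here are illustrative assumptions only; no totals were given in the debate, and the variable names are invented for the sketch.

```python
# Hypothetical worked example of "knock one nought off the end":
# if the covered large sites handle 90% of children's exposure, the
# remaining 10% of sites still leaves a very large absolute number.
children_exposed_now = 1_000_000     # assumed starting figure, not from the debate
share_of_uncovered_sites = 0.10      # the "10% of sites left"

still_exposed = int(children_exposed_now * share_of_uncovered_sites)
print(still_exposed)                 # 100000: one order of magnitude fewer,
                                     # but still "an awful lot of children"
```

The point of the sketch is simply that prioritising by size reduces the figure by one order of magnitude, not to zero, which is why a permanent "large number" threshold is hard to justify.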
Amendment 69A, in the name of the noble Lord, Lord Paddick, has some merit. As the noble Baroness, Lady Benjamin, said, there is a lot of non-commercial material out there. The purpose of this part is to stop children viewing pornography; it does not matter whether the material is commercial or not. If you put in an exemption like this, people will find clever ways to define their sites as non-commercial. In particular, if they can start appealing against this (this is where having a complicated appeals process would become so dangerous) I can see loopholes opening up. So we need to include non-commercial pornography as well, and it is okay if that takes a year.
I also support Amendment 237, in the name of the noble Baroness, Lady Benjamin. We need to have a deadline. It is something that all sites can work towards. We should say that, on whatever date, if sites are not compliant—we suggest that it ought to be a bit like a speed limit, where you ought to slow down before you hit the 30 miles per hour limit—we will issue notices to the ISPs to block them. Something might happen, because you have a level playing field, everything happens on the same date, and under the amendment in the name of the noble Baroness, Lady Benjamin, they will have a year to do it in. That is probably enough to get your regulations in place and so on. It is a very good idea.
My Lords, I am pleased to be able to support Part 3 of the Bill, and Amendments 58 and 65, in their objective of increasing child safety. However, I am concerned that the Government’s proposal in Clause 22 currently leaves many questions unanswered. I am raising these points in the context of the Government stating in the impact assessment for the Bill, published last May, that the regulatory system to be set up under Clause 22 would merely,
“nudge porn providers to comply and put age verification in place”.
That is not consistent with the much bolder manifesto commitment simply to,
“stop children’s exposure to harmful sexualised content online by requiring age verification for access to all sites containing pornographic material”.
Since then the Government have set out a robust position on IP blocking, which leaves websites little room for doubt as to what might happen if they do not comply with Part 3. The enforcement action is clear: the age verification regulator can issue a notice and internet service providers have a duty to respond. In this regard, and alluding back to the previous debate, I think it is vital that Clause 23 should remain as it is—unamended.
However, there has been no upgrading of Clause 22 in parallel with the introduction of Clause 23, so we are left with the notion of “nudging” websites—which gives me little reassurance that this is a robust approach to enforcement. Under Clause 22(1) the age verification regulator may give a notice to a payment provider or an ancillary service provider, but it is not clear when or if the regulator would inform the service provider that such a contravention was happening. Would it be after a fine was not paid or after a letter had been sent—and, if so, how long would a website have to respond before a notice would be given? I hope that the Minister will set out the Government’s intentions.
I support Amendment 58, tabled in the names of the noble Lord, Lord Morrow, and the noble Baroness, Lady Howe. It would require the regulator to issue a notice under Clause 22(1). The noble Baroness deserves much credit for her persistence in bringing this issue before your Lordships’ House over many years. My bigger concern is that, having set out clearly that internet service providers must act in response to a notice from the regulator, there is no transparent statutory expectation on payment providers or ancillary service providers. How do the Government expect enforcement to take place without this power? Others have set out their case on this point in detail and I will not take up the time of the Committee by repeating it, but I am left feeling concerned that there is no power to require service providers to take any action after receiving a notice from the regulator. Furthermore, such a lack of teeth undermines the Government’s manifesto commitment to prevent children accessing all pornographic websites.
I fully support Amendment 65 in the group, which would make it a duty for payment providers and ancillary service providers to act by removing their services from contravening websites, and makes that duty enforceable. I hope that the Government will agree.
My Lords, time is somewhat against us this afternoon. I will be extremely brief. I pass no judgment on where the line should be drawn. I say simply that it is an unassailable argument that it should be drawn in the same place offline and online. Well before the internet of things arrives, the internet is already a method of distribution for DVDs, CDs and books, so it would be entirely illogical to have one rule offline and not apply it online.
My Lords, first I thank the noble Lord, Lord Browne, for supporting my amendment in the last group about proportionality and the order in which websites should be tackled. Moving on to this group, I spoke to this set of amendments when we addressed this issue in the group starting with Amendment 54B—so I can abbreviate my speech and be quick. I support the noble Lord, Lord Browne, on the point made in the part of the briefing he was reading about the Obscene Publications Act and the Crown Prosecution Service advice et cetera being out of step with each other and out of step with enough members of the public for it to matter—that is the real trouble. I had thought to mention one or two of the unsavoury practices that you might find that will not be classified under the current ruling in Clause 23, but I think I have been trumped by the newspapers.
Some in the BBFC probably see this as an opportunity to clean up the internet. But it is not going to work; it will have the reverse effect. This whole issue of what is prohibited material needs to be tackled in another Bill, with a different regulator or enforcer, so it does not get confused with the business of protecting children, which is the purpose of this Bill. It will not protect children anyway, as this material ought to be behind the age verification firewall in any event. In fact, the noble Lord, Lord Browne, pointed out why it might not be: you have a possible lacuna in the Bill. If you say that the material is stuff that the BBFC has classified, the really nasty stuff is not included, because it is not able to be classified—so suddenly Clause 23 might not apply to it. He is absolutely right there. This is one of the dangers, which is why they are having to try and draw in the idea of prohibited material. It would be much easier to remove prohibited material altogether.
It has been suggested to me that the easiest thing would be to alter Clause 16, which deals with the definition of pornography. Instead of having this very limited scope, it would be much easier just to have the one simple definition which is already in Clause 16(1)(e)(i), but with the wording slightly expanded to say, “Without prejudice to the application of the Obscene Publications Act 1959, any material produced solely or principally for the purposes of sexual arousal”. You could leave it at that, and then you would protect children from anything unsavoury that we do not want them to see. That is a much simpler solution than getting into this terribly complicated debate about what is prohibited material.
My Lords, I very much share the concerns expressed by the noble Lord, Lord Browne, about this set of amendments and prohibited material. As they stand, the amendments would have the effect of placing only 18 and R18 material (the material to which Clause 16 is limited) behind age verification checks, while prohibited material would be freely available without any such protection. This would be pretty irresponsible and would show no regard for child protection. Even if the Bill were amended so that prohibited material was legal online only if placed behind age verification checks, we should not forget that the important strategy of targeting the biggest 50 pornography sites will not create a world in which children are free from accessing prohibited material, so that adults can relax and access it without concern. Even if the material was made legal online and given a BBFC classification, this would give it a measure of respectability, in which context it would no doubt become more widely available, and the chances of children seeing it would increase further.
Moreover, the crucial point is that we cannot make prohibited material legal in an online environment at the same time as maintaining the category of prohibited material offline. The former would inevitably result in the latter. Mindful of this, and of the fact that the category of prohibited material is long established, it would be wholly inappropriate for the House or indeed the Government to simply end the category of prohibited material online without a major public consultation. I very much hope that the Minister will completely reject these amendments and stand by what he said on this matter at Second Reading.
The amendment is in my name and that of my noble friend Lord Clement-Jones and the noble Baroness, Lady Jones of Whitchurch. I have to say that it is only because we were quicker on the draw that I am leading on this amendment rather than the noble Baroness.
As I have previously alluded to, we believe that age verification is not sufficient protection for children on the internet. It can easily be circumvented, and it would be very difficult to place age verification on such platforms as Twitter and Tumblr. In relying on this mechanism, there is a danger of diverting attention away from other important and effective methods of addressing the issue of children accessing adult material online. Despite our misgivings, we believe that everything should be done to protect the privacy of those who have their age verified to enable them to access adult material on the internet. I am grateful to the Open Rights Group for its briefing and suggested amendment on this issue, which is the wording we have used for our amendment.
Age verification systems almost inevitably involve creating databases of those who are accessing adult material. It is completely lawful for those who wish to look at adult material to access these websites, but it is a sensitive area and many will be wary about or even deterred from accessing completely legal websites as a result. Security experts agree that unauthorised hacking of databases is almost inevitable, and the advice to organisations is to prepare contingency plans for when rather than if their databases are accessed by those without authority to do so. The consequences of breaching databases containing sensitive personal data can perhaps be most starkly illustrated by the public exposé of the personal details of those who were members of Ashley Madison, which reportedly resulted in two suicides. The risk to privacy can be reduced if the age verification regulator approves minimum standards for age verification providers. These are set out in the amendment.
The amendment suggests that the age verification regulator publish a code of practice, approved by the Secretary of State and laid before Parliament. The code of practice should ensure that everything possible is done to protect the privacy of users and to allow them to choose which age verification system they trust with their sensitive personal information. For example, some websites provide a service that enables users to prove their identity online, including their age, for purposes unconnected with access to adult material but which could also be used for that purpose. The full extent of the provisions is set out in the amendment, and the evidence in support of the amendment is set out in the Open Rights Group’s updated briefing on the Bill.
The Constitution Committee addressed this issue in its 7th report of 2016-17:
“We are concerned that the extent to which the Bill leaves the details of the age-verification regime to guidance and guidelines to be published by the as yet-to-be-designated regulator adversely affects the ability of the House effectively to scrutinise this legislation. Our concern is exacerbated by the fact that, as the Bill currently stands, the guidance and guidelines will come into effect without any parliamentary scrutiny at all. The House may wish to consider whether it would be appropriate for a greater degree of detail to be included on the face of the bill”.
That is exactly what this amendment attempts to do. I beg to move.
My Lords, I want to say a few words because I have been working on the issue of age verification for a long time. I became interested in it when it became apparent a couple of years ago that it was going to come to the top of the agenda. For the last year or so, the Digital Policy Alliance, which I chair, has been working with the British Standards Institution to produce a publicly available specification—PAS 1296—exactly on this issue. Its whole point is to enable anonymised verification of the attribute of your age. People have said that you would have to give the information to the adult content site, the porn site, but you do not necessarily need to.
There are two stages: when the child, or the adult, first arrives at the site; and, if they are allowed into the site, what they then do. At the point when they come to the front page of the site, where they should be asked to prove their age, there should be an option—and this is the point about anonymity—that allows them to bounce off, with a token, to an age verifier. I have on my smartphone, for instance, one from Yoti. I can identify myself to Yoti; it knows about me and can send an encrypted token back to the website, which does not contain any identity information at all—purely the fact that I am over 18. If the regulator later needs to unravel the token because it appears that rules have been breached, it is possible to present the token and start unravelling it—but only with proper powers. The point is that a hacker cannot find out who presented that token. So it is possible now to do what is necessary.
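The token flow described above can be sketched in code. This is a minimal illustrative sketch only: every name in it (the verifier, the escrow key, the toy stream cipher) is my invention, and a real deployment under a specification such as PAS 1296 would use proper public-key signatures and authenticated encryption rather than the stdlib stand-ins here. In particular, the site verifying an HMAC would in practice verify a public-key signature, so that it never holds the verifier's key.

```python
# Illustrative sketch of an anonymised age-attribute token:
# the site learns only "over 18"; only the regulator, holding an
# escrow key, can unravel the token back to an identity.
import base64
import hashlib
import hmac
import json
import secrets

VERIFIER_SIGNING_KEY = secrets.token_bytes(32)   # held by the age verifier
REGULATOR_ESCROW_KEY = secrets.token_bytes(32)   # held only by the regulator

def _stream_cipher(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy keystream (SHA-256 in counter mode) standing in for real AEAD."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def issue_token(user_id: str, over_18: bool) -> dict:
    """The verifier checks identity privately, then emits a token carrying
    only the over-18 attribute; the identity is sealed under escrow."""
    nonce = secrets.token_bytes(12)
    sealed_id = _stream_cipher(REGULATOR_ESCROW_KEY, nonce, user_id.encode())
    body = {
        "over_18": over_18,
        "nonce": base64.b64encode(nonce).decode(),
        "sealed_id": base64.b64encode(sealed_id).decode(),
    }
    payload = json.dumps(body, sort_keys=True).encode()
    sig = hmac.new(VERIFIER_SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "sig": sig}

def site_checks_token(token: dict) -> bool:
    """The site verifies the signature and reads only the attribute;
    it never sees who presented the token."""
    expected = hmac.new(VERIFIER_SIGNING_KEY, token["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        return False
    return json.loads(token["payload"])["over_18"]

def regulator_unravels(token: dict) -> str:
    """Only the holder of the escrow key can recover the identity."""
    body = json.loads(token["payload"])
    nonce = base64.b64decode(body["nonce"])
    sealed = base64.b64decode(body["sealed_id"])
    return _stream_cipher(REGULATOR_ESCROW_KEY, nonce, sealed).decode()

token = issue_token("alice@example.com", over_18=True)
assert site_checks_token(token) is True
assert regulator_unravels(token) == "alice@example.com"
```

The design point is the separation of knowledge: the site holds a signed attribute but no identity, while the regulator's "unravelling" requires a key the site and any hacker of the site never possess.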
That answers the point made by the noble Lord, Lord Maxton. The problem with an identity card is that it will identify you. If you gave your identity to one of these websites and it happened to be hacked, like Ashley Madison, and if you were a Cabinet Minister—or even like most of us here, actually—your career would probably be in ruins. So I think it is essential that people be permitted anonymity. That is why, I am afraid, I am not in favour of the identity card method. There are other similar ways of doing the same thing—
I would, maybe, accept the noble Earl’s point in this particular context, but the ID card has, of course, a variety of different uses—particularly if it is a smartcard—rather than just this one.
Absolutely; I know what the noble Lord means. I simply meant that this is not necessarily an ID application—except, maybe, to identify yourself to the site which then gives your attribute to the other website.
I am thoroughly in favour of the amendment, and so is the industry. We hope to publish a standard on this in the not-too-distant future, which may help the regulator determine who is a fit and proper person to carry it out.
There is just one other thing I want to say. Once you have done your age verification and then go on to the website, if you then choose to subscribe, and give it your credit card number and everything else, that is up to you. I hope and trust that the sites—I know that they are pretty careful about this—will encrypt properly and guard the information with their lives, if not yours.
I do not want to overload the Front-Bench contributions from this side, or to turn this into a mutual admiration society, but I want to say that the noble Earl, Lord Erroll, has played a blinder in educating many of us in this House about the possibilities and the technologies being developed on anonymised age verification. As the Minister probably knows, we had a very useful session with many of those developing new apps for this precise purpose. Yoti was one, VeriMe was another—one could go on. There are different types of age verification, which can be chosen by the consumer. The most recent, which is now virtually available for general use, is Yoti, which the noble Earl mentioned. These methods are now available for use; this is not a question of pie in the sky, or of things not being available for a year or so. That makes the amendment highly practical, and, as my noble friend said, it is absolutely essential for the protection of personal privacy.
My Lords, I had not intended to speak on this point, but this may be relevant evidence. Last year, I went to a meeting with a parliamentary group that was looking at hate speech issues, and a representative of Facebook was there. She said (one may say that this did not show quite a correct view of freedom of expression) that Facebook takes down whatever its customers find offensive. A member of the public said, “Actually, when you have had 20 independent complaints, you take it down and it is immediately put up again”. That second step, the reposting, is where the remedies are not working at present: the material does not stay down. This was mainly about anti-Semitic hate speech of a vile sort that would have been well known in certain quarters in the 1930s. This is an urgent matter for which we need some remedy.
My Lords, it has been suggested to me that this group of amendments could also feed into the code of practice, and that the safety responsibilities could be drawn up to include non-age-verified pornography as well.
My Lords, the Government take the harm caused by online abuse and harassment very seriously, and we will continue to invest in law enforcement capabilities to ensure that all online crime is dealt with properly.
Amendment 70 would require the Government to carry out a review of online abuse and lay a report before Parliament within six months of Royal Assent. We do not believe that it is necessary to include provision for a review in primary legislation. As part of the ending violence against women and girls strategy, we have established an official government working group to map out the current issues, prevalence, initiatives and barriers to addressing gendered online abuse and to produce an action plan.
We are absolutely clear that abusive and threatening behaviour is totally unacceptable in any form, either offline or online. As the Committee will be aware, any action that is illegal when committed offline is also illegal if committed online. Current legislation, some of which was passed before the digital age, has shown itself to be flexible and capable of catching and punishing offenders, whether their crimes were committed by digital means or otherwise. The Protection from Harassment Act 1997 was amended to introduce two new stalking offences to cover conduct that takes place online as well as offline. In addition, the Government will be introducing a new civil stalking protection order to protect victims further.
We will continue to take action where we find gaps in the legislation, just as we did with cyberstalking, harassment and the perpetrators of grossly offensive, obscene or menacing behaviour, and of course we introduced a new law making the fast-growing incidence of revenge porn a specific criminal offence.
The Law Commission recently consulted on including a review of the law covering online abuse as part of its 13th programme of law reform, which will launch later this year. It is expected to confirm with Ministers shortly which projects it proposes should be included.
We are also working to tackle online abuse in schools and have invested £1.6 million to fund a number of anti-bullying organisations.
In addition, we are working to improve the enforcement response to online abuse and harassment so that it can respond to changing technologies. The Home Office has also allocated £4.6 million for a digital transformation programme to equip forces with the tools to police the digital age effectively and to protect the victims of digital crime, including online abuse and harassment. Police and prosecutors gather evidence of offences carried out digitally, non-digitally or both. The CPS Guidelines on Prosecuting Cases Involving Communications Sent via Social Media make clear the range of criminal law which can be brought to bear on offences committed through social media. Moreover, since April 2015, police forces have been recording online instances of crimes, including stalking and harassment.
I shall talk about the next three amendments together, as they all cover the duties of social media sites. Amendment 71AA seeks to make it a requirement for all social media sites to carry out a safety impact assessment. Amendment 71AB seeks to require Ministers to issue a code of practice to ensure that commercial social media platform providers make a consistent and robust response to online abuse on their sites by identifying and assessing online abuse. Amendment 233A seeks to impose a duty on social media services to respond to reports posted on their sites of material which passes the criminal test—that is, that the content would, if published by other means or communicated in person, cause a criminal offence to be committed.
The Government expect social media and interactive services to have robust processes in place that can quickly address inappropriate content and abusive behaviour on their sites. On the point made by the noble Baroness, Lady O’Neill, it is incumbent on all social media companies to provide an effective means for users to report content, and then to take the action they say they will take to deal with it. We believe a statutory code of practice is unworkable because there is no one-size-fits-all solution. Dealing properly with inappropriate content and abuse will vary by service and by incident. Technological considerations might differ by platform and as innovation develops. Users will benefit most if companies develop their own bespoke approaches to reporting tools and in-house processes.
Social media companies take down content that is violent or incites violence if it breaches their terms and conditions. We expect them to inform the police where they identify significant threats or illegal activity happening on their sites. It is, however, extremely difficult to identify where the threat has come from and whether it is serious. We work closely with companies to flag terrorist-related content and have so far secured the voluntary removal of over 250,000 pieces of content since 2010.
I can assure the Committee that we share the sentiments expressed in these amendments. At the moment, though, they are not practical or necessary, so I hope on that basis noble Lords will not press their amendments.