Online Safety Bill Debate
Nigel Evans (Conservative - Ribble Valley)
We will now introduce a six-minute limit on speeches. It may come down but, if Members can take less than six minutes, please do so. I intend to call the Minister at 4.20 pm.
May I, on behalf of my party, welcome the Minister to his place?
I have been reflecting on the contributions made so far and on why we are here. I am here because I know of a female parliamentary candidate who pulled out of the process because of online abuse. I also know of somebody not in my party—it would be unfair to name her or her party—who stood down from public life in Scotland largely because of online abuse. This is something that threatens democracy, which we surely hold most dear.
Most of us are in favour of the Bill. It is high time that we had legislation that keeps users safe online, tackles illegal content and seeks to protect freedom of speech, while also enforcing the regulation of online spaces. It is clear to me from the myriad amendments that the Bill as it currently stands is incomplete, does not go far enough and is a little vague on some issues.
I have tabled two amendments, one of which has already been mentioned and is on media literacy. My party and I believe Ofcom should have a duty to promote and improve the media literacy of the public in relation to regulated user-to-user services and search services. That was originally in the Bill but it has gone. Media literacy is mentioned only in the context of risk assessments. There is no active requirement for internet companies to promote media literacy.
The pandemic proved that a level of skill is needed to navigate the online world. I offer myself as an example. The people who help me out in my office here and in my constituency are repeatedly telling me what I can and cannot do and keeping me right. I am of a certain age, but that shows where education is necessary.
My second amendment is on end-to-end encryption. I do not want anything in this Bill to prevent providers of online services from protecting their users’ privacy through end-to-end encryption. It does provide protection to individuals, and if it is circumvented or broken, criminals and hostile foreign states can breach security. Privacy means security.
There are also concerns about the use of the word “harm” in the Bill. It remains vague and threatens to capture a lot of unintended content. I look forward to seeing what comes forward from the Government on that front. It focuses too much on content as opposed to activity and system design. Regulation of social media must respect the rights to privacy and free expression of those who use it. However, as the right hon. Member for Barking (Dame Margaret Hodge) said, that does not mean a laissez-faire approach: bullying and abuse prevent people from expressing themselves and must at all costs be stamped out, not least because of the two examples I mentioned at the start of my contribution.
As I have said before, the provisions on press exemption are poorly drafted. Under the current plans, the Russian propaganda channel Russia Today, about which I have said quite a bit in this place in the past, would qualify as a recognised news publisher and would therefore be exempt from regulation. That cannot be right. It is the same news channel that had its licence revoked by Ofcom.
I will help you by being reasonably brief, Mr Deputy Speaker, and conclude by saying that as many Members have said, the nature of the Bill means that the Secretary of State will have unprecedented powers to decide crucial legislation later. I speak—I will say it again—as a former chair of the Scottish Parliament’s statutory instruments committee, so I know from my own experience that all too often, instruments that have far-reaching effects are not given the consideration in this place that they should receive. Such instruments should be debated by the rest of us in the Commons.
As I said at the beginning of my speech, the myriad amendments to the Bill make it clear that the rest of us are not willing to allow it to remain so inherently undemocratic. We are going in the right direction, but a lot can be done to improve it. I wait with great interest to see how the Minister responds and what is forthcoming in the period ahead.
Order. We will stick with a time limit of six minutes, but I put everybody on notice that we may have to move that down to five.
I very much welcome the Bill, which has been a long time in the making. It has travelled from my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) to my hon. Friend the Member for Croydon South (Chris Philp) and now to my hon. Friend the Member for Folkestone and Hythe (Damian Collins); I say a huge thank you to them for their work. The Bill required time because this is a very complex matter, and there are huge dangers and challenges in terms of potential offences against freedom of speech. I am glad that Ministers have recognised that and that we are very close to an outcome.
The Bill is really about protection—it is about protecting our children and our society from serious harms—and nobody here would disagree that we want to protect children from harm online. That is what 70% to 80% of the Bill achieves. Nobody would disagree that we need to prevent acts of terror and incitement to violence. We are all on the same page on that across the House. What we are talking about today, and what we have been talking about over the past several months, are nips and tucks to try to improve elements of the Bill. The framework appears to be generally correct. We need to drill down into some of the details to ensure that the areas that each of us is concerned about are dealt with in the Bill we finally produce, as it becomes an Act of Parliament.
There are several amendments tabled in my name and those of other right hon. and hon. Members. I can only canter through them cursorily in the four minutes and 30 seconds remaining to me, but I will put these points on the record in the hope that the Minister will respond positively to many of them.
Amendments 48 and 49 would ensure that providers can decide to keep user-generated content online, taking no action if that content is not harmful. In effect, the Government have accepted those amendments by tabling amendment 71, so I thank the Minister for that.
My amendment 50 says that the presumption should be tipped further in favour of freedom of expression and debate by ensuring that under their contractual terms of service, except in particular circumstances, providers are obliged to leave content online. I emphasise that I am not talking about harmful or illegal content; amendment 50 seeks purely to address content that may be controversial but does not cross the line.
I thank the Minister for that clarification, but there are still many organisations out there, not least the Children’s Charities Coalition, that feel that the Bill does not go far enough on safety by design. Concerns have rightly been expressed about freedom of expression, but if we focus on design rather than content, we can protect freedom of expression while keeping children safe at the same time. New clause 26 is about tackling harms downstream, safeguarding our freedoms and, crucially, expanding participation among children and young people. I fear that we will always be on the back foot when trying to tackle harmful content. I fear that regulators or service providers will become over-zealous in taking down what they consider to be harmful content, removing legal content from their platforms just in case it is harmful, or introducing age gates that deny children access to services outright.
Of course, some internet services are clearly inappropriate for children, and illegal content should be removed—I think we all agree on that—but let us not lock children out of the digital world or let their voices be silenced. Forty-three per cent. of girls hold back their opinions on social media for fear of criticism. Children need a way to exercise their rights. Even the Children’s Commissioner for England has said that heavy-handed parental controls that lock children out of the digital world are not the solution.
I tabled new clause 25 because the Bill’s scope, focusing on user-to-user and search services, is too narrow and not sufficiently future-proof. It should cover all digital technology that is likely to be accessed by children. The term
“likely to be accessed by children”
appears in the age-appropriate design code to ensure that the privacy of children’s data is protected. However, that more expansive definition is not included in the Bill, which imposes duties on only a subset of services to keep children safe. Given rapidly expanding technologies such as the metaverse—which is still in its infancy—and augmented reality, as well as addictive apps and games that promote loot boxes and gambling-type behaviour, we need a much more expansive definition.
I am grateful to the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) for keeping her powder dry and deferring her speech until the next group of amendments, so Members now have five minutes each.
I rise to speak in favour of amendments 15 to 19 in the names of my hon. Friends and, later, amendments 11 and 12 in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright).
As we discussed at great length in Committee—my first Bill Committee; a nice simple one to get me started—the Bill has a number of critical clauses to address the atrocious incidence of child sexual exploitation online. Amendments 15 to 19 are aimed at strengthening those protections and helping to ensure that the internet is a safer place for every young person. Amendments 15 and 16 will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material. Amendments 17 to 19 will tackle the issue of cross-platform abuse, where abuse starts on one platform and continues on another. These are urgent measures that children’s charities and advocacy groups have long called for, and I seriously hope this House will support them.
Last week, along with the shadow Minister and the then Minister, I attended an extremely moving reception hosted by one of those organisations, the NSPCC. It included a speech by Rachel, the mother of a victim of online grooming and child sexual exploitation. She outlined in a very powerful way how her son Ben was forced, from the age of 13, to take and share photos of himself that he did not want to share, and to enter Skype chats with multiple men. He was then blackmailed with those images and subjected to threats of violence against his family. Rachel said to us:
“We blamed ourselves and I thought we had failed…I felt like I hadn’t done enough to protect our children”.
I want to say to you, Rachel, that you did not fail Ben. Responsibility for what happened to Ben lies firmly with the perpetrators of these heinous crimes, but what did fail Ben and has failed our young people for far too long is the lack of urgency and political will to regulate the wild west of the internet. No one is pretending that this is an easy task, and we are dealing with a highly complex piece of legislation, but if we are to protect future Bens we have to strengthen this Bill as much as possible.
Another young woman, Danielle, spoke during the NSPCC event. She had been a victim of online CSE that had escalated into horrific real-world physical and sexual abuse. She told us how she has to live with the fear that her photos may appear online and be shared without her knowledge or control. She is a strong young woman who is moving on with her life with huge resilience, but her trauma is very real. Amendment 19 would ensure that proportionate measures are in place to prevent the encountering or dissemination of child abuse content—for example, through intelligence sharing of new and emerging threats. This will protect Danielle and people like her, giving them some comfort that measures are in place to stop the spread of these images and to place far more onus on the platforms to get on top of this horrific practice.
Amendments 11 and 12, in the name of the right hon. and learned Member for Kenilworth and Southam, will raise the threshold for non-broadcast media outlets to benefit from the recognised news publisher exemption by requiring that such publishers are subject to complaints procedures that are both suitable and sufficient. I support those amendments, which, while not perfect, are a step forward in ensuring that this exception is protected from abuse.
I am also pleased that the Government have listened to some of my and other Members’ concerns and have now agreed to bring forward amendments at a later stage to exclude sanctioned publishers such as Russia Today from accessing this exemption. However, there are hundreds if not thousands of so-called news publishers across the internet that pose a serious threat, from the far right and also from Islamist, antisemitic and dangerous conspiratorial extremism. We must act to ensure that journalistic protections are not abused by those wishing to spread harm. Let us be clear that this is as much about protecting journalism as it is about protecting users from harm.
We cannot overstate the seriousness of getting this right. Carving out protections within the Bill creates a risk that if we do not get the criteria for this exemption right, harmful and extremist websites based internationally will simply establish offices in the UK, just so that they too can access this powerful new protection. Amendments 11 and 12 will go some way towards ensuring that news publishers are genuine, but I recognise that the amendments are not the perfect solution and that more work is needed as the Bill progresses in the other place.
In closing, I hope that we can find consensus today around the importance of protecting children online and restricting harmful content. It is not always easy, but I know we can find common ground in this place, as we saw during the Committee stage of the Bill when I was delighted to gain cross-party support to secure the introduction of Zach’s law, inspired by my young constituent Zach Eagling, which will outlaw the dreadful practice of epilepsy trolling online.
You will resume your seat no later than 4.20 pm. We will therefore not put the clock on you.
I was referring to the amendment’s requirement to list that as part of the priority illegal harms. The priority illegal harms set out in the Bill are all based on existing UK Acts of Parliament where there is a clear established criminal threshold—that is the difference. The spirit of what that convention seeks to achieve, which we would support, is reflected in the harm-based offences written into the Bill.

The big change in the structure of the Bill since the draft Bill was published—the Joint Committee on the Draft Online Safety Bill and I pushed for this at the time—is that far more of these offences have been clearly written into the Bill, so that it is absolutely clear what they apply to. The new offences proposed by the Law Commission, particularly those relating to self-harm and suicide, are another really important addition.

We know what the harms are. We know what we want this Bill to do. The breadth of offences that the hon. Lady and her colleagues have set out is covered in the Bill. Of course, as the law changes and new offences are put in place, the structure of the Bill, through the inclusion of new schedule 7 on priority offences, gives us the mechanism in the future, through instruments of this House, to add new offences to the priority illegal harms as they occur. I expect that that is what would happen. I believe that the spirit of new clause 3 is reflected in the offences that are written into the Bill.
The hon. Member for Pontypridd mentioned Government new clause 14. It is not true that the Government came up with it out of nowhere; there has been extensive consultation with Ofcom and others. The concern is that some social media companies, and some users of services, may have sought to interpret the criminal threshold as being based on whether a court of law has found that an offence has been committed, and that only then might they act. Actually, we want them to pre-empt that, based on a clear understanding of where the legal threshold is. That is how the regulatory codes work. It is an attempt not to weaken the provision, but to bring clarity to the companies and the regulator over its application.
The hon. Member for Ochil and South Perthshire (John Nicolson) raised an important point with regard to the Modern Slavery Act. As the Bill has gone along, we have included existing migration offences and trafficking offences. I would be happy to meet him further to discuss that aspect. Serious offences that exist in law should have an application, either as priority harms or as non-priority legal harms, and we should consider how we do that. I do not know whether he intends to press the amendment, but either way, I would be happy to meet him and to discuss this further.
My hon. Friend the Member for Solihull, the Chair of the Digital, Culture, Media and Sport Committee, raised an important matter with regard to the power of the Secretary of State, which was a common theme raised by several other Members. The hon. Member for Ochil and South Perthshire rightly quoted me, or my Committee’s report, back to me—always a chilling prospect for a politician. I think we have seen significant improvement in the Bill since the draft Bill was published. There was a time when changes to the codes could be made by the negative procedure; now they have to be by a positive vote of both Houses. The Government have recognised that they need to define the exceptional circumstances in which that provision might be used, and to define specifically the areas that are set out. I accept from the Chair of the Select Committee and my right hon. and learned Friend the Member for Kenilworth and Southam that those things could be interpreted quite broadly—maybe more broadly than people would like—but I believe that progress has been made in setting out those powers.
I would also say that this applies only to the period when the codes of practice are being agreed, before they are laid before Parliament. This is not a general provision. I think sometimes there has been a sense that the Secretary of State can at any time pick up the phone to Ofcom and have it amend the codes. Once the codes are approved by the House they are fixed. The codes do not relate to the duties. The duties are set out in the legislation. This is just the guidance that is given to companies on how they comply. There may well be circumstances in which the Secretary of State might look at those draft codes and say, “Actually, we think Ofcom has given the tech companies too easy a ride here. We expected the legislation to push them further.” Therefore it is understandable that in the draft form the Secretary of State might wish to have the power to raise that question, and not dictate to Ofcom but ask it to come back with amendments.
I take on board the spirit of what Members have said and the interest that the Select Committee has shown. I am happy to continue that dialogue, and obviously the Government will take forward the issues that they set out in the letter that was sent round last week to Members, showing how we seek to bring in that definition.
A number of Members raised the issue of freedom of speech provisions, particularly my hon. Friend the Member for Windsor (Adam Afriyie) at the end of his excellent speech. We have sought to bring, in the Government amendments, additional clarity to the way the legislation works, so that it is absolutely clear what the priority legal offences are. Where we have transparency requirements, it is absolutely clear what they apply to. The amendment that the Government tabled reflects the work that he and his colleagues have done, setting out that if we are discussing the terms of service of tech companies, it should be perfectly possible for them to say that this is not an area where they intend to take enforcement action and the Bill does not require them to do so.
The hon. Member for Batley and Spen (Kim Leadbeater) mentioned Zach’s law. The hon. Member for Ochil and South Perthshire raised that before the Joint Committee. So, too, did my hon. Friend the Member for Watford (Dean Russell); he and the hon. Member for Ochil and South Perthshire are great advocates on that. It is a good example of how a clear offence, something that we all agree to be wrong, can be tackled through this legislation; in this case, a new offence will be created, to prevent the pernicious targeting of people with epilepsy with flashing images.
Finally, in response to the speech by the hon. Member for Aberdeen North (Kirsty Blackman), I certainly will continue dialogue with the NSPCC on the serious issues that she has raised. Obviously, child protection is foremost in our mind as we consider the legislation. She made some important points about the ability to scan for encrypted images. The Government have recently made further announcements on that, to be reflected as the Bill progresses through the House.
To assist the House, I anticipate two votes on this first section and one vote immediately on the next, because it has already been moved and debated.
I am anticipating another Division, as I said, and then I understand there may be some points of order, which I will hear after that Division.
That concludes proceedings on new clauses, new schedules and amendments to those parts of the Bill that have to be concluded by 4.30 pm.
It has been pointed out to me that, in this unusually hot weather, Members should please remember to drink more water. I tried it myself once. [Laughter.]
In accordance with the programme (No. 2) order of today, we now come to new clauses, new schedules and amendments relating to those parts of the Bill to be concluded by 7 pm. We begin with new clause 14, which the House has already debated. I therefore call the Minister to move new clause 14 formally.
New Clause 14
Providers’ judgements about the status of content
“(1) This section sets out the approach to be taken where—
(a) a system or process operated or used by a provider of a Part 3 service for the purpose of compliance with relevant requirements, or
(b) a risk assessment required to be carried out by Part 3, involves a judgement by a provider about whether content is content of a particular kind.
(2) Such judgements are to be made on the basis of all relevant information that is reasonably available to a provider.
(3) In construing the reference to information that is reasonably available to a provider, the following factors, in particular, are relevant—
(a) the size and capacity of the provider, and
(b) whether a judgement is made by human moderators, by means of automated systems or processes or by means of automated systems or processes together with human moderators.
(4) Subsections (5) to (7) apply (as well as subsection (2)) in relation to judgements by providers about whether content is—
(a) illegal content, or illegal content of a particular kind, or
(b) a fraudulent advertisement.
(5) In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is content of the kind in question (and a provider must treat content as content of the kind in question if reasonable grounds for that inference exist).
(6) Reasonable grounds for that inference exist in relation to content and an offence if, following the approach in subsection (2), a provider—
(a) has reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied, and
(b) does not have reasonable grounds to infer that a defence to the offence may be successfully relied upon.
(7) In the case of content generated by a bot or other automated tool, the tests mentioned in subsection (6)(a) and (b) are to be applied in relation to the conduct or mental state of a person who may be assumed to control the bot or tool (or, depending what a provider knows in a particular case, the actual person who controls the bot or tool).
(8) In considering a provider’s compliance with relevant requirements to which this section is relevant, OFCOM may take into account whether providers’ judgements follow the approaches set out in this section (including judgements made by means of automated systems or processes, alone or together with human moderators).
(9) In this section—
“fraudulent advertisement” has the meaning given by section 34 or 35 (depending on the kind of service in question);
“illegal content” has the same meaning as in Part 3 (see section 52);
“relevant requirements” means—
(a) duties and requirements under this Act, and
(b) requirements of a notice given by OFCOM under this Act.”—(Damian Collins.)
This new clause clarifies how providers are to approach judgements (human or automated) about whether content is content of a particular kind, and in particular, makes provision about how questions of mental state and defences are to be approached when considering whether content is illegal content or a fraudulent advertisement.
Brought up.
Question put, That the clause be added to the Bill.
I will only allow three more points of order, because this is eating into time for very important business. [Interruption.] They are all similar points of order and we could carry on with them until 7 o’clock, but we are not going to do so.
Further to that point of order, Mr Deputy Speaker. At the Public Administration and Constitutional Affairs Committee this morning, Sir John Major presented evidence to us about propriety and ethics. In that very sombre presentation, he talked about being
“at the top of a slope”
down towards the loss of democracy in this country. Ultimately, the will of Parliament is all we have, so if we do not have Parliament to make the case, what other option do we have?
Order. I ask the final Members please to show restraint as far as language is concerned, because I am not happy with some of the language that has been used.
Further to that point of order, Mr Deputy Speaker. There have been 50 resignations of Ministers; the Government are mired in controversy; people are acting up as Ministers who are not quite Ministers, as I understand it; and legislation is being delayed. When was there ever a better time for the House to table a motion of no confidence in a Government? This is a cowardly act not by the Prime Minister, but by the Conservative party, which does not want a vote on this issue. Conservative Members should support the move to have a vote of no confidence and have the courage to stand up for their convictions.
Further to that point of order, Mr Deputy Speaker. Can you inform the House of whether Mr Speaker has received any explanation from the Government for this craven and egregious breach of parliamentary convention? If someone were to table a motion under Standing Order No. 24 for tomorrow, has he given any indication of what his attitude would be towards such a motion?
I will answer the question about Standing Order No. 24 first, because I can deal with it immediately: clearly, if an application is made, Mr Speaker will determine it himself.
The principles concerning motions of no confidence are set out at paragraph 18.44 of “Erskine May”, which also gives examples of motions that have been debated and those that have not. “May” says:
“By established convention, the Government always accedes to the demand from the Leader of the Opposition to allot a day for the discussion of a motion tabled by the official Opposition which, in the Government’s view, would have the effect of testing the confidence of the House.”
I can only conclude, therefore, that in the Government’s view the motion, as tabled by the official Opposition, does not have that effect. That is a matter for the Government, though, rather than for the Chair.
May I say that there are seven more sitting days before recess? As Deputy Speaker, I would anticipate that there will be further discussions.
We now have to move on with the continuation of business on the Bill.
New Clause 7
Duties regarding user-generated pornographic content: regulated services
“(1) This section sets out the duties which apply to regulated services in relation to user-generated pornographic content.
(2) A duty to verify that each individual featuring in the pornographic content has given their permission for the content in which they feature to be published or made available by the service.
(3) A duty to remove pornographic content featuring a particular individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.
(4) For the meaning of ‘pornographic content’, see section 66(2).
(5) In this section, ‘user-generated pornographic content’ means any content falling within the meaning given by subsection (4) and which is also generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, which may be encountered by another user, or other users, of the service.
(6) For the meaning of ‘regulated service’, see section 2(4).”—(Dame Diana Johnson.)
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.