Online Safety Bill Debate
(2 years, 4 months ago)
Commons Chamber
We will now introduce a six-minute limit on speeches. It may come down but, if Members can take less than six minutes, please do so. I intend to call the Minister at 4.20 pm.
May I, on behalf of my party, welcome the Minister to his place?
I have been reflecting on the contributions made so far and why we are here. I am here because I know of a female parliamentary candidate who pulled out of the process because of online abuse. I also know of somebody not in my party—it would be unfair to name her or her party—who stood down from public life in Scotland mostly because of online abuse. This is something that threatens democracy, which we surely hold most dear.
Most of us are in favour of the Bill. It is high time that we had legislation that keeps users safe online, tackles illegal content and seeks to protect freedom of speech, while also enforcing the regulation of online spaces. It is clear to me from the myriad amendments that the Bill as it currently stands is not complete and does not go far enough. That is self-evident. It is a little vague on some issues.
I have tabled two amendments, one of which has already been mentioned and is on media literacy. My party and I believe Ofcom should have a duty to promote and improve the media literacy of the public in relation to regulated user-to-user services and search services. That was originally in the Bill but it has gone. Media literacy is mentioned only in the context of risk assessments. There is no active requirement for internet companies to promote media literacy.
The pandemic proved that a level of skill is needed to navigate the online world. I offer myself as an example. The people who help me out in my office here and in my constituency are repeatedly telling me what I can and cannot do and keeping me right. I am of a certain age, but that shows where education is necessary.
My second amendment is on end-to-end encryption. I do not want anything in this Bill to prevent providers of online services from protecting their users’ privacy through end-to-end encryption. It provides real protection to individuals; if it is circumvented or broken, criminals and hostile foreign states can breach security. Privacy means security.
There are also concerns about the use of the word “harm” in the Bill. It remains vague and threatens to capture a lot of unintended content. I look forward to seeing what comes forward from the Government on that front. It focuses too much on content as opposed to activity and system design. Regulation of social media must respect the rights to privacy and free expression of those who use it. However, as the right hon. Member for Barking (Dame Margaret Hodge) said, that does not mean a laissez-faire approach: bullying and abuse prevent people from expressing themselves and must at all costs be stamped out, not least because of the two examples I mentioned at the start of my contribution.
As I have said before, the provisions on press exemption are poorly drafted. Under the current plans, the Russian propaganda channel Russia Today, on which I have said quite a bit in this place in the past, would qualify as a recognised news publisher and would therefore be exempt from regulation. That cannot be right. It is the same news channel that had its licence revoked by Ofcom.
I will help you by being reasonably brief, Mr Deputy Speaker, and conclude by saying that, as many Members have said, the nature of the Bill means that the Secretary of State will have unprecedented powers to decide crucial legislation later. I speak—I will say it again—as a former chair of the Scottish Parliament’s statutory instruments committee, so I know from my own experience that all too often, instruments that have far-reaching effects are not given the consideration in this place that they should receive. Such instruments should be debated by the rest of us in the Commons.
As I said at the beginning of my speech, the myriad amendments to the Bill make it clear that the rest of us are not willing to allow it to remain so inherently undemocratic. We are going in the right direction, but a lot can be done to improve it. I wait with great interest to see how the Minister responds and what is forthcoming in the period ahead.
Order. We will stick with a time limit of six minutes, but I put everybody on notice that we may have to move that down to five.
I very much welcome the Bill, which has been a long time in the making. It has travelled from my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) to my hon. Friend the Member for Croydon South (Chris Philp) and now to my hon. Friend the Member for Folkestone and Hythe (Damian Collins); I say a huge thank you to them for their work. The Bill required time because this is a very complex matter. There are huge dangers and challenges in the risk of infringing freedom of speech. I am glad that Ministers have recognised that and that we are very close to an outcome.
The Bill is really about protection—it is about protecting our children and our society from serious harms—and nobody here would disagree that we want to protect children from harm online. That is what 70% to 80% of the Bill achieves. Nobody would disagree that we need to prevent acts of terror and incitement to violence. We are all on the same page on that across the House. What we are talking about today, and what we have been talking about over the past several months, are nips and tucks to try to improve elements of the Bill. The framework appears to be generally correct. We need to drill down into some of the details to ensure that the areas that each of us is concerned about are dealt with in the Bill we finally produce, as it becomes an Act of Parliament.
There are several amendments tabled in my name and those of other right hon. and hon. Members. I can only canter through them cursorily in the four minutes and 30 seconds remaining to me, but I will put these points on the record in the hope that the Minister will respond positively to many of them.
Amendments 48 and 49 would ensure that providers can decide to keep user-generated content online, taking no action if that content is not harmful. In effect, the Government have accepted those amendments by tabling amendment 71, so I thank the Minister for that.
My amendment 50 says that the presumption should be tipped further in favour of freedom of expression and debate by ensuring that under their contractual terms of service, except in particular circumstances, providers are obliged to leave content online. I emphasise that I am not talking about harmful or illegal content; amendment 50 seeks purely to address content that may be controversial but does not cross the line.
I thank the Minister for that clarification, but there are still many organisations out there, not least the Children’s Charities Coalition, that feel that the Bill does not go far enough on safety by design. Concerns have rightly been expressed about freedom of expression, but if we focus on design rather than content, we can protect freedom of expression while keeping children safe at the same time. New clause 26 is about tackling harms downstream, safeguarding our freedoms and, crucially, expanding participation among children and young people. I fear that we will always be on the back foot when trying to tackle harmful content. I fear that regulators or service providers will become over-zealous in taking down what they consider to be harmful content, removing legal content from their platforms just in case it is harmful, or introducing age gates that deny children access to services outright.
Of course, some internet services are clearly inappropriate for children, and illegal content should be removed—I think we all agree on that—but let us not lock children out of the digital world or let their voices be silenced. Forty-three per cent. of girls hold back their opinions on social media for fear of criticism. Children need a way to exercise their rights. Even the Children’s Commissioner for England has said that heavy-handed parental controls that lock children out of the digital world are not the solution.
I tabled new clause 25 because the Bill’s scope, focusing on user-to-user and search services, is too narrow and not sufficiently future-proof. It should cover all digital technology that is likely to be accessed by children. The term
“likely to be accessed by children”
appears in the age-appropriate design code to ensure that the privacy of children’s data is protected. However, that more expansive definition is not included in the Bill, which imposes duties on only a subset of services to keep children safe. Given rapidly expanding technologies such as the metaverse—which is still in its infancy—and augmented reality, as well as addictive apps and games that promote loot boxes and gambling-type behaviour, we need a much more expansive definition.
I am grateful to the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) for keeping her powder dry and deferring her speech until the next group of amendments, so Members now have five minutes each.
I rise to speak in favour of amendments 15 to 19 in the names of my hon. Friends and, later, amendments 11 and 12 in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright).
As we discussed at great length in Committee—my first Bill Committee; a nice simple one to get me started—the Bill has a number of critical clauses to address the atrocious incidence of child sexual exploitation online. Amendments 15 to 19 are aimed at strengthening those protections and helping to ensure that the internet is a safer place for every young person. Amendments 15 and 16 will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material. Amendments 17 to 19 will tackle the issue of cross-platform abuse, where abuse starts on one platform and continues on another. These are urgent measures that children’s charities and advocacy groups have long called for, and I seriously hope this House will support them.
Last week, along with the shadow Minister and the then Minister, I attended an extremely moving reception hosted by one of those organisations, the NSPCC. It included a speech by Rachel, a mother of a victim of online grooming and child sexual exploitation. She outlined in a very powerful way how her son Ben was forced from the age of 13 to take and share photos of himself that he did not want to, and to enter Skype chats with multiple men. He was then blackmailed with those images and subjected to threats of violence to his family. Rachel said to us:
“We blamed ourselves and I thought we had failed…I felt like I hadn’t done enough to protect our children”.
I want to say to you, Rachel, that you did not fail Ben. Responsibility for what happened to Ben lies firmly with the perpetrators of these heinous crimes, but what did fail Ben and has failed our young people for far too long is the lack of urgency and political will to regulate the wild west of the internet. No one is pretending that this is an easy task, and we are dealing with a highly complex piece of legislation, but if we are to protect future Bens we have to strengthen this Bill as much as possible.
Another young woman, Danielle, spoke during the NSPCC event. She had been a victim of online CSE that had escalated into horrific real-world physical and sexual abuse. She told us how she has to live with the fear that her photos may appear online and be shared without her knowledge or control. She is a strong young woman who is moving on with her life with huge resilience, but her trauma is very real. Amendment 19 would ensure that proportionate measures are in place to prevent the encountering or dissemination of child abuse content—for example, through intelligence sharing of new and emerging threats. This will protect Danielle and people like her, giving them some comfort that measures are in place to stop the spread of these images and to place far more onus on the platforms to get on top of this horrific practice.
Amendments 11 and 12, in the name of the right hon. and learned Member for Kenilworth and Southam, will raise the threshold for non-broadcast media outlets to benefit from the recognised news publisher exemption by requiring that such publishers are subject to complaints procedures that are both suitable and sufficient. I support those amendments, which, while not perfect, are a step forward in ensuring that this exemption is protected from abuse.
I am also pleased that the Government have listened to some of my and other Members’ concerns and have now agreed to bring forward amendments at a later stage to exclude sanctioned publishers such as Russia Today from accessing this exemption. However, there are hundreds if not thousands of so-called news publishers across the internet that pose a serious threat, from the far right and also from Islamist, antisemitic and dangerous conspiratorial extremism. We must act to ensure that journalistic protections are not abused by those wishing to spread harm. Let us be clear that this is as much about protecting journalism as it is about protecting users from harm.
We cannot overstate the seriousness of getting this right. Carving out protections within the Bill creates a risk that if we do not get the criteria for this exemption right, harmful and extremist websites based internationally will simply establish offices in the UK, just so that they too can access this powerful new protection. Amendments 11 and 12 will go some way towards ensuring that news publishers are genuine, but I recognise that the amendments are not the perfect solution and that more work is needed as the Bill progresses in the other place.
In closing, I hope that we can find consensus today around the importance of protecting children online and restricting harmful content. It is not always easy, but I know we can find common ground in this place, as we saw during the Committee stage of the Bill when I was delighted to gain cross-party support to secure the introduction of Zach’s law, inspired by my young constituent Zach Eagling, which will outlaw the dreadful practice of epilepsy trolling online.
You will resume your seat no later than 4.20 pm. We will therefore not put the clock on you.
I was referring to the amendment’s requirement to list that as part of the priority illegal harms. The priority illegal harms set out in the Bill are all based on existing UK Acts of Parliament where there is a clear established criminal threshold—that is the difference. The spirit of what that convention seeks to achieve, which we would support, is reflected in the harm-based offences written into the Bill. The big change in the structure of the Bill since the draft Bill was published—the Joint Committee on the Draft Online Safety Bill and I pushed for this at the time—is that far more of these offences have been clearly written into the Bill so that it is absolutely clear what they apply to. The new offences proposed by the Law Commission, particularly those relating to self-harm and suicide, are another really important addition. We know what the harms are. We know what we want this Bill to do. The breadth of offences that the hon. Lady and her colleagues have set out is covered in the Bill. But of course, as the law changes and new offences are put in place, the structure of the Bill, through the inclusion of new schedule 7 on priority offences, gives us the mechanism in the future, through instruments of this House, to add new offences to those priority illegal harms as they occur. I expect that that is what would happen. I believe that the spirit of new clause 3 is reflected in the offences that are written into the Bill.
The hon. Member for Pontypridd mentioned Government new clause 14. It is not true that the Government came up with it out of nowhere. There has been extensive consultation with Ofcom and others. The concern is that some social media companies, and some users of services, may have sought to interpret the criminal threshold as being based on whether a court of law has found that an offence has been committed, and only then might they act. Actually, we want them to pre-empt that, based on a clear understanding of where the legal threshold is. That is how the regulatory codes work. So it is an attempt not to weaken the provision but to bring clarity to the companies and the regulator over its application.
The hon. Member for Ochil and South Perthshire (John Nicolson) raised an important point with regard to the Modern Slavery Act. As the Bill has gone along, we have included existing immigration offences and trafficking offences. I would be happy to meet him further to discuss that aspect. Serious offences that exist in law should have an application, either as priority harms or as non-priority legal harms, and we should consider how we do that. I do not know whether he intends to press the amendment, but either way, I would be happy to meet him and to discuss this further.
My hon. Friend the Member for Solihull, the Chair of the Digital, Culture, Media and Sport Committee, raised an important matter with regard to the power of the Secretary of State, which was a common theme raised by several other Members. The hon. Member for Ochil and South Perthshire rightly quoted me, or my Committee’s report, back to me—always a chilling prospect for a politician. I think we have seen significant improvement in the Bill since the draft Bill was published. There was a time when changes to the codes could be made by the negative procedure; now they have to be by a positive vote of both Houses. The Government have recognised that they need to define the exceptional circumstances in which that provision might be used, and to define specifically the areas that are set out. I accept from the Chair of the Select Committee and my right hon. and learned Friend the Member for Kenilworth and Southam that those things could be interpreted quite broadly—maybe more broadly than people would like—but I believe that progress has been made in setting out those powers.
I would also say that this applies only to the period when the codes of practice are being agreed, before they are laid before Parliament. This is not a general provision. I think sometimes there has been a sense that the Secretary of State can at any time pick up the phone to Ofcom and have it amend the codes. Once the codes are approved by the House they are fixed. The codes do not relate to the duties. The duties are set out in the legislation. This is just the guidance that is given to companies on how they comply. There may well be circumstances in which the Secretary of State might look at those draft codes and say, “Actually, we think Ofcom has given the tech companies too easy a ride here. We expected the legislation to push them further.” Therefore it is understandable that in the draft form the Secretary of State might wish to have the power to raise that question, and not dictate to Ofcom but ask it to come back with amendments.
I take on board the spirit of what Members have said and the interest that the Select Committee has shown. I am happy to continue that dialogue, and obviously the Government will take forward the issues that they set out in the letter that was sent round last week to Members, showing how we seek to bring in that definition.
A number of Members raised the issue of freedom of speech provisions, particularly my hon. Friend the Member for Windsor (Adam Afriyie) at the end of his excellent speech. We have sought to bring, in the Government amendments, additional clarity to the way the legislation works, so that it is absolutely clear what the priority legal offences are. Where we have transparency requirements, it is absolutely clear what they apply to. The amendment that the Government tabled reflects the work that he and his colleagues have done, setting out that if we are discussing the terms of service of tech companies, it should be perfectly possible for them to say that this is not an area where they intend to take enforcement action and the Bill does not require them to do so.
The hon. Member for Batley and Spen (Kim Leadbeater) mentioned Zach’s law. The hon. Member for Ochil and South Perthshire raised that before the Joint Committee. So, too, did my hon. Friend the Member for Watford (Dean Russell); he and the hon. Member for Ochil and South Perthshire are great advocates on that. It is a good example of how a clear offence, something that we all agree to be wrong, can be tackled through this legislation; in this case, a new offence will be created, to prevent the pernicious targeting of people with epilepsy with flashing images.
Finally, in response to the speech by the hon. Member for Aberdeen North (Kirsty Blackman), I certainly will continue dialogue with the NSPCC on the serious issues that she has raised. Obviously, child protection is foremost in our mind as we consider the legislation. She made some important points about the ability to scan for encrypted images. The Government have recently made further announcements on that, to be reflected as the Bill progresses through the House.
To assist the House, I anticipate two votes on this first section and one vote immediately on the next, because it has already been moved and debated.
I am anticipating another Division, as I said, and then I understand there may be some points of order, which I will hear after that Division.
That concludes proceedings on new clauses, new schedules and amendments to those parts of the Bill that have to be concluded by 4.30 pm.
It has been pointed out to me that, in this unusually hot weather, Members should please remember to drink more water. I tried it myself once. [Laughter.]
In accordance with the programme (No. 2) order of today, we now come to new clauses, new schedules and amendments relating to those parts of the Bill to be concluded by 7 pm. We begin with new clause 14, which the House has already debated. I therefore call the Minister to move new clause 14 formally.
New Clause 14
Providers’ judgements about the status of content
“(1) This section sets out the approach to be taken where—
(a) a system or process operated or used by a provider of a Part 3 service for the purpose of compliance with relevant requirements, or
(b) a risk assessment required to be carried out by Part 3, involves a judgement by a provider about whether content is content of a particular kind.
(2) Such judgements are to be made on the basis of all relevant information that is reasonably available to a provider.
(3) In construing the reference to information that is reasonably available to a provider, the following factors, in particular, are relevant—
(a) the size and capacity of the provider, and
(b) whether a judgement is made by human moderators, by means of automated systems or processes or by means of automated systems or processes together with human moderators.
(4) Subsections (5) to (7) apply (as well as subsection (2)) in relation to judgements by providers about whether content is—
(a) illegal content, or illegal content of a particular kind, or
(b) a fraudulent advertisement.
(5) In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is content of the kind in question (and a provider must treat content as content of the kind in question if reasonable grounds for that inference exist).
(6) Reasonable grounds for that inference exist in relation to content and an offence if, following the approach in subsection (2), a provider—
(a) has reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied, and
(b) does not have reasonable grounds to infer that a defence to the offence may be successfully relied upon.
(7) In the case of content generated by a bot or other automated tool, the tests mentioned in subsection (6)(a) and (b) are to be applied in relation to the conduct or mental state of a person who may be assumed to control the bot or tool (or, depending what a provider knows in a particular case, the actual person who controls the bot or tool).
(8) In considering a provider’s compliance with relevant requirements to which this section is relevant, OFCOM may take into account whether providers’ judgements follow the approaches set out in this section (including judgements made by means of automated systems or processes, alone or together with human moderators).
(9) In this section—
“fraudulent advertisement” has the meaning given by section 34 or 35 (depending on the kind of service in question);
“illegal content” has the same meaning as in Part 3 (see section 52);
“relevant requirements” means—
(a) duties and requirements under this Act, and
(b) requirements of a notice given by OFCOM under this Act.”—(Damian Collins.)
This new clause clarifies how providers are to approach judgements (human or automated) about whether content is content of a particular kind, and in particular, makes provision about how questions of mental state and defences are to be approached when considering whether content is illegal content or a fraudulent advertisement.
Brought up.
Question put, That the clause be added to the Bill.
I will only allow three more points of order, because this is eating into time for very important business. [Interruption.] They are all similar points of order and we could carry on with them until 7 o’clock, but we are not going to do so.
Further to that point of order, Mr Deputy Speaker. At the Public Administration and Constitutional Affairs Committee this morning, Sir John Major presented evidence to us about propriety and ethics. In that very sombre presentation, he talked about being
“at the top of a slope”
down towards the loss of democracy in this country. Ultimately, the will of Parliament is all we have, so if we do not have Parliament to make the case, what other option do we have?
Order. I ask the final Members please to show restraint as far as language is concerned, because I am not happy with some of the language that has been used.
Further to that point of order, Mr Deputy Speaker. There have been 50 resignations of Ministers; the Government are mired in controversy; people are acting up as Ministers who are not quite Ministers, as I understand it; and legislation is being delayed. When was there ever a better time for the House to table a motion of no confidence in a Government? This is a cowardly act not by the Prime Minister, but by the Conservative party, which does not want a vote on this issue. Conservative Members should support the move to have a vote of no confidence and have the courage to stand up for their convictions.
Further to that point of order, Mr Deputy Speaker. Can you inform the House of whether Mr Speaker has received any explanation from the Government for this craven and egregious breach of parliamentary convention? If someone were to table a motion under Standing Order No. 24 for tomorrow, has he given any indication of what his attitude would be towards such a motion?
I will answer the question about Standing Order No. 24 first, because I can deal with it immediately: clearly, if an application is made, Mr Speaker will determine it himself.
The principles concerning motions of no confidence are set out at paragraph 18.44 of “Erskine May”, which also gives examples of motions that have been debated and those that have not. “May” says:
“By established convention, the Government always accedes to the demand from the Leader of the Opposition to allot a day for the discussion of a motion tabled by the official Opposition which, in the Government’s view, would have the effect of testing the confidence of the House.”
I can only conclude, therefore, that the Government have taken the view that the motion, as tabled by the official Opposition, does not have that effect. That is a matter for the Government, though, rather than for the Chair.
May I say that there are seven more sitting days before recess? As Deputy Speaker, I would anticipate that there will be further discussions.
We now have to move on with the continuation of business on the Bill.
New Clause 7
Duties regarding user-generated pornographic content: regulated services
“(1) This section sets out the duties which apply to regulated services in relation to user-generated pornographic content.
(2) A duty to verify that each individual featuring in the pornographic content has given their permission for the content in which they feature to be published or made available by the service.
(3) A duty to remove pornographic content featuring a particular individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.
(4) For the meaning of ‘pornographic content’, see section 66(2).
(5) In this section, ‘user-generated pornographic content’ means any content falling within the meaning given by subsection (4) and which is also generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, and which may be encountered by another user, or other users, of the service.
(6) For the meaning of ‘regulated service’, see section 2(4).”—(Dame Diana Johnson.)
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
Online Safety Bill Debate
(1 year, 11 months ago)
Commons Chamber
It is a privilege to follow my hon. Friend the Member for Watford (Dean Russell) and so many hon. Members who have made thoughtful contributions. I will confine my comments to the intersection of new clauses 28 and 45 to 50 with the impact of online pornography on children in this country.
There has been no other time in the history of humanity when we have exposed children to the violent, abusive, sexually explicit material that they currently encounter online. In 2008, only 14% of children under 13 had seen pornography; three years later, that figure had risen to 49%, correlating with the rise in children owning smartphones. Online pornography has a uniquely pernicious impact on children. For very young children, there is an impact just from seeing the content. For older teenagers, there is an impact on their behaviour.
We are seeing more and more evidence of boys exhibiting sexually aggressive behaviour, with actions such as strangulation, which we have dealt with separately in this House, and misogynistic attitudes. Young girls are being conditioned into thinking that their value depends on being submissive or objectified. That is leading children down a pathway that leads to serious sexual offending by children against children. Overwhelmingly, the victims are young girls.
Hon. Members need not take my word for it: after Everyone’s Invited began documenting the nature and extent of the sexual experiences happening in our schools, an Ofsted review revealed that the most prevalent victims of serious sexual assaults among the under-25s are girls aged 15 to 17. In a recent publication in anticipation of the Bill, the Children’s Commissioner cited the example of a teenage boy arrested for his part in the gang rape of a 14-year-old girl. In his witness statement to the police, the boy said that it felt just like a porn film.
Dr John Foubert, the former White House adviser on rape prevention, has said:
“It wasn’t until 10 years ago when I came to the realization that the secret ingredient in the recipe for rape was not secret at all…That ingredient…is today’s high speed Internet pornography.”
The same view has been expressed, in one form or another, by the chief medical officers for England and for Wales, the Independent Inquiry into Child Sexual Abuse, the Government Equalities Office, the Children’s Commissioner, Ofsted and successive Ministers.
New clause 28 requests an advocacy body to represent and protect the interests of child users. I welcome the principle behind the new clause. I anticipate that the Minister will say that he is already halfway there by making the Children’s Commissioner a statutory consultee to Ofcom, along with the Domestic Abuse Commissioner and others who have been named in this debate. However, whatever the Government make of the Opposition’s new clause, they must surely agree that it alights on one important point: the online terrain in respect of child protection is evolving very fast.
By the time the Bill reaches the statute book, new providers will have popped up again. With them will come unforeseen problems. When the Bill was first introduced, TikTok did not exist, as my hon. Friend the Member for Watford said a moment ago, and neither did OnlyFans. That is precisely the kind of user-generated site that is likely to try and dodge its obligations to keep children safe from harm, partly because it probably does not even accept that it exposes them to harm: it relies on the fallacy that the user is in control, and operates an exploitative business model predicated on that false premise.
I think it important for someone to represent child protection on a regular basis because of the issue of age verification, which we have canvassed, quite lightly, during the debate. Members on both sides of the House have pointed out that the current system, which allows children to self-certify their date of birth, is hopelessly out of date. I know that Ministers envisage something much more ambitious with the Bill’s age assurance and age verification requirements, including facial recognition technology, but I think it is worth our having a constant voice reporting on the adequacy of whatever age assurance steps internet providers may take, because we know how skilful children can be in navigating the internet. We know that there are those who have the technological skills to shroud their IP address or to use a VPN. I also think it important for there to be a voice to maintain the pressure on the Government—which is what I myself want to do tonight—for an official Government inquiry into pornography harms, akin to the one on gambling harms that was undertaken in 2019. That inquiry was extremely important in identifying all the harm that was caused by gambling. The conclusions of an equivalent inquiry into pornography would leave no wriggle room for user-generated services to deny the risk of harm.
My right hon. Friend the Member for Basingstoke (Dame Maria Miller) pointed out, very sensibly, that her new clauses 45 to 50 build on all the Law Commission’s recommendations. They dovetail with so much work that has already been done in the House. We have produced, for instance, the Domestic Abuse Act 2021, which dealt with revenge porn, whether threatened or actual and whether genuine or fake, and with coercive control. Many Members recognise what was achieved by all our work a couple of years ago. However, given the indication from Ministers that they are minded to accept the new clauses in one form or another, I should like them to explain to the House how they think the Bill will capture the issue of sexting, if, indeed, it will capture that issue at all.
As the Minister will know, sexting means the exchanging of intimate images by, typically, children, sometimes on a nominally consensual basis. Everything I have read about it seems to say, “Yes, prima facie this is an unlawful act, but no, we do not seek to criminalise children, because we recognise that they make errors of judgment.” However, while I agree that it may be proportionate not to criminalise children for doing this, it remains the case that when an image is sent with the nominal consent of the child—it is nearly always a girl—it is often a product of duress, the image is often circulated far beyond the original recipient, and that often has devastating personal consequences for the young girl involved. All the main internet providers now have technology that can identify a nude image. It would be possible to require them to prevent nude images from being shared when, because of extended age-verification abilities, they know that the user is a child. If the Government are indeed minded to accept new clauses 45 to 50, I should like them to address that specific issue of sexting rather than letting it fall by the wayside as something separate, or outside the ambit of the Bill.
Thank you, Mr Deputy Speaker. I think you are the third person to take the Chair during the debate. It is an honour to follow my hon. Friend the Member for Newbury (Laura Farris); I agree with everything that she said, and my comments will be similar.
This has been a long but fascinating debate. We have discussed only a small part of the Bill today, and just a few amendments, but the wide range of the debate reflects the enormous complexity of what the Bill is intended to do, which is to regulate the online world so that it is subject to rules, regulations, obligations and protective measures equivalent to those in the offline world. We must do this, because the internet is now an essential part of our infrastructure. I think that we see the costs of our high-speed broadband as being in the same category as our energy and water costs, because we could not live without it. Like all essential infrastructure, the internet must be regulated. We must ensure that providers are working in the best interests of consumers, within the law and with democratic accountability.
Regulating the internet through the Bill is not a one-off project. As many Members have said, it will take years to get it right, but we must begin now. I think the process can be compared with the regulation of roads. A century ago there were hardly any private motor cars on the roads. There were no rules; people did not even have to drive on a particular side of the road. There have been more than 100 years of frequent changes to rules and regulations to get it right. It seems crazy now to think there was a time when there were no speed limits and no seat belts. The death rates on the roads, even in the 1940s, were 13 times higher than they are now. Over time, however, with regulation, we have more or less solved the complex problems of road regulation. Similarly, it will take time to get this Bill right, but we must get it on to the statute book and give it time to evolve.
Online Safety Bill Debate
(1 year, 10 months ago)
Commons Chamber
In a nutshell, we must be able to threaten tech bosses with jail. There is precedent for that—jail sentences for senior managers are commonplace for breaches of duties across a great range of UK legislation. That is absolutely and completely clear, and as a former shadow Attorney General, I know exactly what the law is on this subject. I can say this: we must protect our children and grandchildren from predatory platforms operating for financial gain on the internet. Such activity is endemic throughout the world and in the UK, inducing suicide, self-harm and sexual abuse, and it is an assault on the minds of our young children and on those affected by it, including families such as Ian Russell’s. He has shown great courage in speaking out about the tragedy of his 14-year-old child’s suicide as a result of such activities, as the coroner made clear. It is unthinkable that we will not deal with that. We are dealing with it now, and I thank the Secretary of State and the Minister for responding with constructive dialogue in the short space of time since we got to grips with this issue.
The written ministerial statement is crystal clear. It says that
“where senior managers, or those purporting to act in that capacity, have consented or connived in ignoring enforceable requirements, risking serious harm to children. The criminal penalties, including imprisonment and fines, will be commensurate with similar offences.”
We can make a comparison, as the right hon. Member for Barking (Dame Margaret Hodge) made clear, with financial penalties in the financial services sector, which is also international. There is also the construction industry, as my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) just said. Those penalties are already on our statute book.
I do not care what the European Union is doing in its legislation. I am glad to know that the Irish legislation, which has been passed and is an Act, has been through different permutations and examinations. The Irish have come up with something that includes similar severe penalties. It can be done. But this is our legislation in this House. We will do it the way that we want to do it to protect our children and families. I am just about fed up with listening to the mealy-mouthed remarks from those who say, “You can’t do it. It’s not quite appropriate.” To hell with that. We are talking about our children.
On past record, which I just mentioned, in 1977-78, a great friend of mine, Cyril Townsend, the Member for Bexleyheath, introduced the first Protection of Children Bill. He asked me to help him, and I did. We got it through. That was incredibly difficult at the time. You have no idea, Mr Deputy Speaker, how much resistance was put up by certain Members of this House, including Ministers. I spoke to Jim Callaghan—I have been in this House so long that I was here with him after he had been Prime Minister—and asked, “How did you give us so much time to get the Bill through?” He said, “It’s very simple. I was sitting in bed with my wife in the flat upstairs at No. 10. She wasn’t talking to me. I said, ‘What’s wrong, darling?’ She replied, ‘If you don’t get that Protection of Children Bill through, I won’t speak to you for six months.’” And it went through, so there you go. There is a message there for all Secretaries of State, and even Prime Ministers.
I raised this issue with the Prime Minister in December in a question at the Liaison Committee. I invited him to consider it, and I am so glad that we have come to this point after very constructive discussion and dialogue. It needed that. It is a matter not of chariots of fire but of chariots on fire, because we have done all this in three weeks. I am extremely grateful to the 51 MPs who stood firm. I know the realities of this House, having been involved in one or two discussions in the past. As a rule, it is only when you have the numbers that the results start to come. I pay tribute to the Minister for the constructive dialogue.
The Irish legislation will provide a model, but this will be our legislation. It will be modelled on some of the things that have already been enacted there, but it is not simply a matter of their legislation being transformed into ours. It will be our legislation. In the European Parliament—
I too rise to speak to new clause 2, which seeks to introduce senior manager criminal liability to the Bill. As my hon. Friend the Member for Stone (Sir William Cash) set out, we will not push it to a vote as a result of the very welcome commitments that the Minister has made to introduce a similar amendment in the other place.
Protecting children is not just the role of parents but the responsibility of the whole of society, including our institutions and businesses that wish to trade here. That is the primary aim of this Bill, which I wholeheartedly support: to keep children safe online from horrendous and unspeakable harms, many of which were mentioned by my right hon. Friend the Member for South Northamptonshire (Dame Andrea Leadsom).
We look back in horror at children being forced to work down mines or neglected in Victorian orphanages, but I believe we will look back with similar outrage at online harms. What greater violation could there be of childhood than to entice a child to collaborate in their own sexual abuse in the privacy and supposed safety of their own bedroom? Yet this is one of the many crimes that are occurring on an industrial scale every day. Past horrors such as children down mines were tackled by robust legislation, and the Online Safety Bill must continue our Parliament’s proud tradition of taking on vested interests to defend the welfare of children.
The Bill must succeed in its mission, but in its present form, it does not have sufficient teeth to drive the determination that is needed in tech boardrooms to tackle the systemic issue of the malevolent algorithms that drive this sickening content to our children. There is no doubt that the potential fines in the Bill are significant, but many of these companies have deep pockets, and the only criminal sanctions are for failure to share data with Ofcom. The inquest following the tragic death of Molly Russell was an example of this, as no one could be held personally responsible for what happened to her. I pay tribute to Ian Russell, Molly’s father, whose courage in the face of such personal tragedy has made an enormous difference in bringing to light the extent of online harms.
Only personal criminal liability will drive proactive change, and we have seen this in other areas such as the financial services industry and the construction industry. I am delighted that the Government have recognised the necessity of senior manager liability for tech bosses, after much campaigning across the House, and committed to introducing it in the other place. I thank the Secretary of State and her team for the very constructive and positive way in which they have engaged with supporters of this measure.