All 2 John Penrose contributions to the Online Safety Act 2023


Mon 5th Dec 2022
Tue 12th Sep 2023

Online Safety Bill
Commons Chamber

John Penrose Excerpts
Dame Margaret Hodge (Barking) (Lab)

That is for an investigation by the appropriate enforcement agency—Ofcom et al.—and if there is evidence that culpability rests with the managing director, the owner or whoever, they should be prosecuted. It is as simple as that. A case would have to be established through evidence, and that should be carried out by the enforcement agency. I do not think that this is any different from any other form of financial or other crime. In fact, it is my experience in that field that brought me to this conclusion.

John Penrose (Weston-super-Mare) (Con)

The right hon. Lady is making a powerful case, particularly on the effective enforcement of rules to ensure that they bite properly and that people genuinely pay attention to them. She gave the example of a senior executive talking about whether people should be stopped for getting it wrong—I think the case she mentioned was Holocaust denial—by making factually inaccurate statements or allowing factually inaccurate statements to persist on their platform. May I suggest that her measures would be even stronger if she were to support new clause 34, which I have tabled? My new clause would make factual inaccuracy a wrong, to be prevented and pursued by the kinds of regulators she is talking about. It would give her measure a much stronger basis on which to rest.

Dame Margaret Hodge

Indeed. The way the hon. Gentleman describes his new clause, which I will look at, is absolutely right, but may I make a more general point, because it speaks to the question of "legal but harmful"? What I really fear with the "legal but harmful" rule is that we create more and more laws to make content illegal and that, ironically, locks up more and more people, rather than creating structures and systems that will prevent the harm occurring in the first place. So I am not always in favour of new laws simply criminalising individuals. I would have loved us to keep to the "legal but harmful" route.

We can look to Elon Musk’s recent controversial takeover of Twitter. Decisions taken by Twitter’s newest owner—by Elon Musk himself—saw use of the N-word increase by nearly 500% within 12 hours of acquisition. And allowing Donald Trump back on Twitter gives a chilling permission to Trump and others to use the site yet again to incite violence.

The tech giants know that their business models are dangerous. Platforms can train their systems to recognise so-called borderline content and reduce engagement. However, it is for business reasons, and business reasons alone, that they actively choose not to do that. In fact, they do the opposite and promote content known to trigger extreme emotions. These platforms are like a “danger for profit” machine, and the decision to allow that exploitation is coming from the top. Do not take my word for it; just listen to the words of Ian Russell. He has said:

“The only person that I’ve ever come across in this whole world…that thought that content”—

the content that Molly viewed—

“was safe was…Meta.”

There is a huge disconnect between what Silicon Valley executives think is safe and what we expect, both for ourselves and for our children. By introducing liability for directors, the behaviour of these companies might finally change. Experience elsewhere has shown us that that would prove to be the most effective way of keeping online users safe. New clause 17 would hold directors of a regulated service personally liable on the grounds that they have failed, or are failing, to comply with any duties set out in relation to their service, for instance a failure that leads to the death of a child. The new clause further states that the decision on who was liable would be made by Ofcom, not the provider, meaning that responsibility could not be shirked.

I say to all Members that if we really want to reduce the amount of harmful abuse online, then making senior directors personally liable is a very good way of achieving it. Some 82% of UK adults agree with us, Labour Front Benchers agree and Back Benchers across the House agree. So I urge the Government to rethink their position on director liability and support new clause 17 as a cross-party amendment. I really think it will make a difference.

--- Later in debate ---
Although I am pleased that the Bill is back before us today, I am disappointed that aspects have been weakened since we last considered it, and urge the Government to consider closely some proposals we will vote on this evening, which would go a considerable way to ensuring that the online world is a safer place for children and adults, works in the interests of users, and holds platforms accountable and responsible for protecting us all online.
John Penrose

It is a pleasure to follow Zach’s MP, the hon. Member for Batley and Spen (Kim Leadbeater). I particularly want to pick up on her final comments about the difficulties of platforms—not just small platforms, but larger ones—hosting extremist content, be it incels, the alt-right, the radical left or any other kind.

I will speak to my new clauses 34 and 35, which seek to deal with both disinformation and misinformation. They are important amendments, because although the Bill has taken huge steps forward—we are led to believe that it may take a couple more in due course when the revised version comes back if the recommittal is passed—there are still whole categories of harm that it does not yet address. In particular, it focuses, rightly and understandably, on individual harms to children and illegal activities as they relate to adults, but it does not yet deal with anything to do with collective harms to our society and our democracy, which matter too.

We have heard from former journalists in this debate. Journalists know it takes time and money to come up with a properly researched, authoritatively correct, accurate piece of journalism, but it takes a fraction of that time and cost to invent a lie. A lie will get halfway around the world before the truth has got its boots on, as the saying rightly goes. Incidentally, the hon. Member for Rotherham (Sarah Champion) said that it is wonderful that we are all learning so much. I share that sentiment; it is marvellous that we are all comparing and sharing our particular areas of expertise.

One person who seems to have all areas of expertise under his belt is my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who chaired the Joint Committee. He rightly pointed out that this is a systems Bill, and it therefore deals with trying to prevent some things from happening—and yet it is completely silent on misinformation and disinformation, and their effect on us collectively, as a society and a democracy. New clauses 34 and 35 are an attempt to begin to address those collective harms alongside some individual harms we face. One of them deals with a duty of balance; the other deals with factual accuracy.

The duty of balance is an attempt to address the problem as it relates to filter bubbles, because this is a systems Bill and because each of us has a tailored filter bubble, by which each of the major platforms, and some of the minor ones, work out what we are interested in and feed us more of the same. That is fine for people who are interested in fishing tackle; that is super. But if someone is interested in incels and they get fed more and more incel stuff, or they are vaguely left wing and get taken down a rabbit hole into the increasingly radical left—or alternatively alt-right, religious extremism or whatever it may be—pretty soon they get into echo chambers, and from echo chambers they get into radicalisation, and from radicalisation they can pretty soon end up in some very murky, dark and deep waters.

There are existing rules for old-world broadcasters; the BBC, ITV and all the other established broadcasters have duties of balance and of avoiding undue prominence imposed on them by Ofcom. My argument is that we should consider ways to impose a similar duty of balance on the people who put together the programs that create our individual filter bubbles, so that when someone is shown an awful lot of material about incels, or alt-right or radical left politics, somewhere in that filter bubble they will be sent something saying, "You do know that this is only part of the argument, don't you? Do you know that there is another side to this? Here's the alternative; here's the balancing point." We are not doing that at the moment, which is one of the reasons our societal and political debate is increasingly divided, and our public square as a society is becoming increasingly fractious—and, in some cases, dangerous. New clause 35 would fix that particular problem.

New clause 34 would deal with the other point—the fact that a lie will get halfway around the world before the truth has got its boots on. It tries to deal with factual accuracy. Factual accuracy is not quite the same thing as truth. Truth is an altogether larger and more philosophical concept to get one’s head around. It is how we string together accurate and correct facts to create a narrative or an explanation. Factual accuracy is an essential building block for truth. We must at least try to ensure that we can all see when someone has made something up or invented something, whether it is that bleach is a good way to cure covid or whatever. When somebody makes something up, we need to know and it needs to be clear. In many cases that is clear, but in many cases, if it is a plausible lie, a deepfake or whatever it may be, it is not clear. We need to be able to see that easily, quickly and immediately, and say, “I can discount this, because I know that the person producing it is a serial liar and tells huge great big porkies, and I shouldn’t be trusting what they are sending me, or I can see that the actual item itself is clearly made up.”

The duty of achieving balance already exists in rules and law in other parts of our society and is tried and tested—it has stood us very well and done a good job for us for 40 or 50 years, since TV and radio became ubiquitous—and the same is true, although not for quite such a long time, for factual accuracy. There are increasingly good methods of checking the factual accuracy of individual bits of content, and if necessary, in some cases of doing so in real time, too. For example, Adobe is leading a very large global grouping producing something called the Content Authenticity Initiative, which can tell if something is a deepfake, because it has an audit trail of where the image, the item or whatever it may be came from and how it has been updated, modified or changed during the course of its life.

Dean Russell (Watford) (Con)

On that point, I want to raise the work that my hon. Friend the Member for Bosworth (Dr Evans), who is not in the Chamber at the moment, has done on body image. When images are photoshopped and altered to present an idea of beauty that is very different from what is possible in the real world, that speaks very much to the question of truth. What are my hon. Friend's thoughts on that point?

John Penrose

Addressing that is absolutely essential. That goes for any of the deepfake examples we have heard about, including from my right hon. Friend the Member for Basingstoke (Dame Maria Miller). The whole point about a deepfake is that it is hard to tell that anything has been altered; if we can know that something has been changed, we can easily say, "I know that is not right, I know that is not true, I know that is false, and I can steer away from it and treat it accordingly".

Just to make sure that everybody understands, this is not some piece of new tech magic; it is already established. Adobe, as I have said, is doing it with the Content Authenticity Initiative, which is widely backed by other very serious tech firms. Others in the journalism world are doing the same thing, with the Journalism Trust Initiative. There is NewsGuard, which produces trust ratings; the Trust Project, which produces trust indicators; and we of course have our own press regulators in this country, the Independent Press Standards Organisation and IMPRESS.

I urge the Government and all here present not to be satisfied with where this Bill stands now. We have all heard how it can be improved. We have all heard that this is a new, groundbreaking and difficult area in which many other countries have not got even as far as we have, but we should not be in any way satisfied with where we are now. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said earlier that we need to approach this Bill in a spirit of humility, and this is an area in which humility is absolutely essential. I hope that all of us realise how much further we have to go, and I hope the Minister will say how he proposes to address these important and as yet unaddressed issues in due course.

Liz Twist (Blaydon) (Lab)

I wish to address new clauses 16 and 28 to 30, and perhaps make a few passing comments on some others along the way. Many others who, like me, were in the Chamber for the start of the debate will I suspect feel like a broken record, because we keep revisiting the same issues and raising the same points again and again, and I am going to do exactly that.

First, I will speak about new clause 16, which would create a new offence of encouraging or assisting serious self-harm. I do so as chair of the all-party parliamentary group on suicide and self-harm prevention, which has done a good deal of work looking at self-harm among young people over the last two years. We know that suicide is the leading cause of death in men aged under 50 and in women aged under 35, with the latest available figures confirming that 5,583 people in England and Wales tragically took their own lives in 2021. We know that self-harm is a strong risk factor for future suicidal ideation, so it is really important that we tackle this issue.

The internet can be an invaluable and very supportive place for some people, who are given the opportunity to access support, but for others it is harmful. They may encounter content that acts to encourage, maintain or exacerbate self-harm and suicidal behaviours. Detailed information about methods can also increase the likelihood of imitative and copycat suicides, with risks such as contagion effects also present in the online environment.

Online Safety Bill

John Penrose Excerpts

Consideration of Lords amendments
Tuesday 12th September 2023
Commons Chamber
Paul Scully

The hon. Lady asks an important question, and that is the essence of what we are doing. We have tried to make this Bill flexible and proportionate. It is not technology specific, so that it is as future-proofed as possible. We must obviously lean into Ofcom as it seeks to operationalise the Act once the Bill gains Royal Assent. Ofcom will come back with its reporting, so not only will Government and the Department be a check on this, but Parliament will be able to assess the efficacy of the Bill as the system beds in and as technology and the various platforms move on and develop.

I talked about the offences, and I will just finalise my point about criminal liability. Those offences will be punishable with up to two years in prison.

John Penrose (Weston-super-Mare) (Con)

Further to that point about the remaining gaps in the Bill, I appreciate what the Minister says about this area being a moving target. Everybody—not just in this country, but around the world—is having to learn as the internet evolves.

I thank the Minister for Government amendment 241, which deals with provenance and understanding where information posted on the web comes from, and allows people therefore to check whether they want to see it, if it comes from dubious sources. That is an example of a collective harm—of people posting disinformation and misinformation online and attempting to subvert our democratic processes, among other things. I park with him, if I may, the notion that we will have to come back to that area in particular. It is an area where the Bill is particularly weak, notwithstanding all the good stuff it does elsewhere, notably on the areas he has mentioned. I hope that everyone in this House accepts that that area will need to be revisited in due course.

Paul Scully

Undoubtedly we will have to come back to that point. Not everything needs to be in the Bill at this point. We have industry initiatives, such as Adobe's Content Authenticity Initiative, which are good in themselves, but as we better understand misinformation, disinformation, deepfakes and the proliferation and repetition of fake images, fake text and fake news, we will need to keep ensuring that we can stay ahead of the game, as my hon. Friend said. That is why we have made the legislation flexible.