All 3 Adam Afriyie contributions to the Online Safety Act 2023


Tue 19th Apr 2022
Online Safety Bill
Commons Chamber

2nd reading
Tue 12th Jul 2022
Online Safety Bill
Commons Chamber

Report stage (day 1)
Mon 5th Dec 2022

Online Safety Bill

Adam Afriyie Excerpts
2nd reading
Tuesday 19th April 2022

Commons Chamber
Adam Afriyie (Windsor) (Con)

Overall, I very much welcome the Bill. It has been a long time coming, but none of us here would disagree that we need to protect our children, certainly from pornography and all sorts of harassment and awful things that are on the internet and online communications platforms. There is no argument or pushback there at all. I welcome the age verification side of things. We all welcome that.

The repeal of the Malicious Communications Act 1988 is a good move. The adjustment of a couple of sections of the Communications Act 2003 is also a really good, positive step, and I am glad that the Bill is before us now. I think pretty much everyone here would agree with the principles of the Bill, and I thank the Government for getting there eventually and introducing it. However, as chair of the freedom of speech all-party parliamentary group I need to say a few words and express a few concerns about some of the detail and some of the areas where the Bill could perhaps be improved still further.

The first point relates to the requirement that social media companies have regard to freedom of speech. It is very easy, with all the concerns we have—I have them too—to push too hard and say that social media companies should clamp down immediately on anything that could be even slightly harmful, even if it is uncertain what “harmful” actually means. We must not give them the powers or the incentive, through financial penalties, to shut down freedom of speech just in case something is seen to be harmful by somebody. As the Bill progresses, therefore, it would be interesting to look at whether there is an area where we can tighten up rights and powers on freedom of speech.

Secondly, there is the huge issue—one or two other Members have raised it—of definitions. Clearly, if we say that something that is illegal should not be there and should disappear, of course we would all agree with that. If we say that something that is harmful should not be there, should not be transmitted and should not be amplified, we start to get into difficult territory, because what is harmful for one person may not be harmful for another. So, again, we need to take a little more of a look at what we are talking about there. I am often called “Tory scum” online. I am thick-skinned; I can handle it. It sometimes happens in the Chamber here—[Laughter.]—but I am thick-skinned and I can handle it. So, what if there was an option online for me to say, “You know what? I am relaxed about seeing some content that might be a bit distasteful for others. I am okay seeing it and hearing it.”? In academic discourse in particular, it is really important to hear the other side of the argument, the other side of a discussion, the other side of a debate. Out of context, one phrase or argument might be seen to be really harmful to a certain group within society. I will just flag the trans debate. Even the mention of the word trans or the words male and female can really ignite, hurt and harm. We could even argue that it is severe harm. Therefore, we need to be very careful about the definitions we are working towards.

Finally, the key principle is that we should ensure that adults who have agency can make decisions for themselves. I hope social media companies can choose not to remove content entirely or amplify content, but to flag content so that grown-ups with agency like us, like a lot of the population, can choose to opt in or to opt out.

Online Safety Bill

Adam Afriyie Excerpts
Nick Fletcher

I understand what my hon. Friend is saying, but the list of what is legal but harmful will be set by the Secretary of State, not by Parliament. All we ask is for that to be discussed on the Floor of the House before we place those duties on the companies. That is all I am asking us to do.

Facebook has about 3 billion active users globally. That is more than double the population of China, the world’s most populous nation, and it is well over half the number of internet users in the entire world. These companies are unlike any others we have seen in history. For hundreds of millions of people around the world, they are the public square, which is how the companies have described themselves. Twitter founder Jack Dorsey said in 2018:

“We believe many people use Twitter as a digital public square. They gather from all around the world to see what’s happening, and have a conversation about what they see.”

In 2019, Mark Zuckerberg said:

“Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square.”

Someone who is blocked from these platforms is blocked from the public square, as we saw when the former President of the United States was blocked. Whatever we might think about Donald Trump, it cannot be right that he was banned from Twitter. We have to have stronger protection for free speech in the digital public square than clause 19 gives. The Bill gives the Secretary of State the power to define what is legal but harmful by regulations. As I have said, this is an area where free speech could easily be affected—

Adam Afriyie (Windsor) (Con)

I commend my hon. Friend for the powerful speech he is making. It seems to many of us here that if anyone is going to be setting the law or a regulation, it should really be done in the Chamber of this House. I would be very happy if we had annual debates on what may be harmful but is currently lawful, in order to make it illegal. I very much concur with what he is saying.

Nick Fletcher

I thank my hon. Friend for his contribution, which deals with what I was going to finish with. It is not enough for the Secretary of State to have to consult Ofcom; there should be public consultation too. I support amendment 55, which my hon. Friend has tabled.

--- Later in debate ---
Mr Deputy Speaker (Mr Nigel Evans)

Order. We will stick with a time limit of six minutes, but I put everybody on notice that we may have to move that down to five.

Adam Afriyie

I very much welcome the Bill, which has been a long time in the making. It has travelled from my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) to my hon. Friend the Member for Croydon South (Chris Philp) and now to my hon. Friend the Member for Folkestone and Hythe (Damian Collins); I say a huge thank you to them for their work. The Bill required time because this is a very complex matter. There are huge dangers and challenges in terms of committing offences against freedom of speech. I am glad that Ministers have recognised that and that we are very close to an outcome.

The Bill is really about protection—it is about protecting our children and our society from serious harms—and nobody here would disagree that we want to protect children from harm online. That is what 70% to 80% of the Bill achieves. Nobody would disagree that we need to prevent acts of terror and incitement to violence. We are all on the same page on that across the House. What we are talking about today, and what we have been talking about over the past several months, are nips and tucks to try to improve elements of the Bill. The framework appears to be generally correct. We need to drill down into some of the details to ensure that the areas that each of us is concerned about are dealt with in the Bill we finally produce, as it becomes an Act of Parliament.

There are several amendments tabled in my name and those of other right hon. and hon. Members. I can only canter through them cursorily in the four minutes and 30 seconds remaining to me, but I will put these points on the record in the hope that the Minister will respond positively to many of them.

Amendments 48 and 49 would ensure that providers can decide to keep user-generated content online, taking no action if that content is not harmful. In effect, the Government have accepted those amendments by tabling amendment 71, so I thank the Minister for that.

My amendment 50 says that the presumption should be tipped further in favour of freedom of expression and debate by ensuring that under their contractual terms of service, except in particular circumstances, providers are obliged to leave content online. I emphasise that I am not talking about harmful or illegal content; amendment 50 seeks purely to address content that may be controversial but does not cross the line.

--- Later in debate ---
Dame Diana Johnson

May I welcome the Minister to his place, as I did not get an opportunity to speak on the previous group of amendments?

New clause 7 and amendments 33 and 34 would require online platforms to verify the age and consent of all individuals featured in pornographic videos uploaded to their site, as well as enabling individuals to withdraw their consent to the footage remaining on the website. Why are the amendments necessary? Let me read a quotation from a young woman:

“I sent Pornhub begging emails. I pleaded with them. I wrote, ‘Please, I’m a minor, this was assault, please take it down.’”

She received no reply and the videos remained live. That is from a BBC article entitled “I was raped at 14, and the video ended up on a porn site”.

This was no one-off. Some of the world’s biggest pornography websites allow members of the public to upload videos without verifying that everyone in the film is an adult or that everyone in the film gave their permission for it to be uploaded. As a result, leading pornography websites have been found to be hosting and profiting from filmed footage of rape, sex trafficking, image-based sexual abuse and child sexual abuse.

In 2020, The New York Times documented the presence of child abuse videos on Pornhub, one of the most popular pornography websites in the world, prompting Mastercard, Visa and Discover to block the use of their cards for purchases on the site. The New York Times reporter Nicholas Kristof wrote about Pornhub:

“Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.”

Even before that, in 2019, PayPal took the decision to stop processing payments for Pornhub after an investigation by The Sunday Times revealed that the site contained child abuse videos and other illegal content. The newspaper reported:

“Pornhub is awash with secretly filmed ‘creepshots’ of schoolgirls and clips of men performing sex acts in front of teenagers on buses. It has also hosted indecent images of children as young as three.

The website says it bans content showing under-18s and removes it swiftly. But some of the videos identified by this newspaper’s investigation had 350,000 views and had been on the platform for more than three years.”

One of the women who is now being forced to take legal action against Pornhub’s parent company, MindGeek, is Crystal Palace footballer Leigh Nicol. Leigh’s phone was hacked and private content was uploaded to Pornhub without her knowledge. She said in an interview:

“The damage is done for me so this is about the next generation. I feel like prevention is better than someone having to react to this. I cannot change it alone but if I can raise awareness to stop it happening to others then that is what I want to do…The more that you dig into this, the more traumatising it is because there are 14-year-old kids on these websites and they don’t even know about it. The fact that you can publish videos that have neither party’s consent is something that has to be changed by law, for sure.”

Leigh Nicol is spot on.

Unfortunately, when this subject was debated in Committee, the previous Minister, the hon. Member for Croydon South (Chris Philp), argued that the content I have described—including child sexual abuse images and videos—was already illegal, and there was therefore no need for the Government to introduce further measures. However, that misses the point: the Minister was arguing against the very basis of his own Government’s Bill. At the core of the Bill, as I understand it, is a legal duty placed on online platforms to combat and remove content that is already illegal, such as material relating to terrorism. In keeping with that, my amendments would place a legal duty on online platforms hosting pornographic content to combat and remove illegal content through the specific and targeted measure of verifying the age and consent of every individual featured in pornographic content on their sites. The owners and operators of pornography websites are getting very rich from hosting footage of rape, trafficking and child sexual abuse, and they must be held to account under the law and required to take preventive action.

The Organisation for Security and Co-operation in Europe, which leads action to combat human trafficking across 57 member states, recommends that Governments require age and consent verification on pornography websites in order to combat exploitation. The OSCE told me:

“These sites routinely feature sexual violence, exploitation and abuse, and trafficking victims. Repeatedly these sites have chosen profits over reasonable prevention and protection measures. At the most basic level, these sites should be required to ensure that each person depicted is a consenting adult, with robust age verification and the right to withdraw consent at any time. Since self-regulation hasn’t worked, this will only work through strong, state-led regulation”.

Who else supports that? Legislation requiring online platforms to verify the age and consent of all individuals featured in pornographic content on their sites is backed by leading anti-sexual exploitation organisations including CEASE—the Centre to End All Sexual Exploitation—UK Feminista and the Traffickinghub movement, which has driven the global campaign to expose the abuses committed by, in particular, Pornhub.

New clause 7 and amendments 33 and 34 are minimum safety measures that would stop the well-documented practice of pornography websites hosting and profiting from videos of rape, trafficking and child sexual abuse. I urge the Government to reconsider their position, and I will seek to test the will of the House on new clause 7 later this evening.

Adam Afriyie

I echo the concerns expressed by the right hon. Member for Kingston upon Hull North (Dame Diana Johnson). Some appalling abuses are taking place online, and I hope that the Bill goes some way to address them, to the extent that that is possible within the framework that it sets up. I greatly appreciate the right hon. Lady’s comments and her contribution to the debate.

I have a tight and narrow point for the Minister. In amendment 56, I seek to ensure that only pornographic material is caught by the definition in the Bill. My concern is that we catch these abuses online, catch them quickly and penalise them harshly, but also that sites that may display, for example, works of art featuring nudes—or body positivity community sites, of which there are several—are not inadvertently caught in our desire to clamp down on illegal pornographic sites. Perhaps the Minister will say a few words about that in his closing remarks.

Barbara Keeley (Worsley and Eccles South) (Lab)

I rise to speak to this small group of amendments on behalf of the Opposition. Despite everything that is going on at the moment, we must remember that this Bill has the potential to change lives for the better. It is an important piece of legislation, and we cannot miss the opportunity to get it right. I would like to join my hon. Friend the Member for Pontypridd (Alex Davies-Jones) in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins) to his role. His work as Chair of the Joint Committee on this Bill was an important part of the pre-legislative scrutiny process, and I look forward to working in collaboration with him to ensure that this legislation does as it should in keeping us all safe online. I welcome the support of the former Minister, the hon. Member for Croydon South (Chris Philp), on giving access to data to academic researchers and on looking at the changes needed to deal with the harm caused by the way in which algorithmic prompts work. It was a pity he was not persuaded by the amendments in Committee, but better late than never.

--- Later in debate ---
Sir Jeremy Wright

I think it is extraordinarily important that this Bill does what the hon. Member for Worsley and Eccles South (Barbara Keeley) has just described. As the Bill moves from this place to the other place, we must debate what the right balance is between what the Secretary of State must do—in the previous group of amendments, we heard that many of us believe that is too extensive as the Bill stands—what the regulator, Ofcom, must do and what Parliament must do. There is an important judgment call for this House to make on whether we have that balance right in the Bill as it stands.

These amendments are very interesting. I am not convinced that the amendments addressed by the hon. Lady get the balance exactly right either, but there is cause for further discussion about where we in this House believe the correct boundary is between what an independent regulator should be given authority to do under this legislative and regulatory structure and what we wish to retain to ourselves as a legislature.

Adam Afriyie

My right hon. and learned Friend is highlighting, and I completely agree, that there is a very sensitive balance between different power bases and between different approaches to achieving the same outcome. Does he agree that as even more modifications are made—the nipping and tucking I described earlier—this debate and future debates, and these amendments, will contribute to those improvements over the weeks and months ahead?

Sir Jeremy Wright

Yes, I agree with my hon. Friend about that. I hope it is some comfort to the hon. Member for Worsley and Eccles South when I say that if the House does not support her amendment, it should not be taken that she has not made a good point that needs further discussion—probably in the other place, I fear. We are going to have to think carefully about that balance. It is also important that we do not retain to ourselves as a legislature those things that the regulator ought to have in its own armoury. If we want Ofcom to be an effective and independent regulator in this space, we must give it sufficient authority to fulfil that role. She makes interesting points, although I am not sure I can go as far as supporting her amendments. I know that is disappointing, but I do think that she has prompted a further debate on exactly this balance between Secretary of State, Executive, legislature and regulator, which is exactly where we need to be.

I have two other things to mention. The first relates to new clause 7 and amendment 33, which the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) tabled. She speaks powerfully to a clear need to ensure that this area is properly covered. My question, however, is about practicalities. I am happy to take an intervention if she can answer it immediately. If not, I am happy to discuss it with her another time. She has heard me speak many times about making sure that this Bill is workable. The challenge in what she has described in her amendments may be that a platform needs to know how it is to determine and “verify”—that is the word she has used—that a participant in a pornographic video is an adult and a willing participant. It is clearly desirable that the platform should know both of those things, but the question that will have to be answered is: by what mechanism will it establish that? Will it ask the maker of the pornographic video and be prepared to accept the assurances it is given? If not, by what other mechanism should it do this? For example, there may be a discussion to be had on what technology is available to establish whether someone is an adult or is not—that bleeds into the discussion we have had about age assurance. It may be hard for a platform to establish whether someone is a willing participant.

--- Later in debate ---
Sir Jeremy Wright

Yes, I am grateful to the hon. Lady for that useful addition to this debate, but it goes to the very point I was seeking to clarify: whether what the right hon. Member for Kingston upon Hull North has in mind is to ensure that a platform would be expected to make use of those mechanisms that already exist in order to satisfy itself of the things that she rightly asks it to be satisfied of, or whether something beyond that would be required to meet her threshold. If it is the former, that is manageable for platforms and perfectly reasonable for us to expect of them. If it is the latter, we need to understand a little more clearly how she expects a platform to achieve that greater assurance. If it is that, she makes an interesting point.

Finally, let me come to amendment 56, tabled by my hon. Friend the Member for Windsor (Adam Afriyie). Again, I have a practical concern. He seeks to ensure that the pornographic content is “taken as a whole”, but I think it is worth remembering why we have included pornographic content in the context of this Bill. We have done it to ensure that children are not exposed to this content online and that where platforms are capable of preventing that from happening, that is exactly what they do. There is a risk that if we take this content as a whole, it is perfectly conceivable that there may be content online that is four hours long, only 10 minutes of which is pornographic in nature. It does not seem to me that that in any way diminishes our requirement of a platform to ensure that children do not see those 10 minutes of pornographic content.

Adam Afriyie

I am very sympathetic to that view. I am merely flagging up for the Minister that if we get the opportunity, we need to have a look at it again in the Lords, to be absolutely certain that we are not ruling out certain types of art, and certain types of community sites that we would all think were perfectly acceptable, that are probably not accessible to children, just to ensure that we are not creating further problems down the road that we would have to correct.

Online Safety Bill

Adam Afriyie Excerpts
Adam Afriyie (Windsor) (Con)

My right hon. Friend will be aware that the measure will encompass every single telephone conversation when it switches to IP. That is data, too.

Mr Davis

That is correct. The companies cannot easily focus the measure on malicious content alone, and that is the problem. With everything we do in dealing with enforcing the law, we have to balance the extent to which we make the job of the law enforcement agency possible—ideally, easy—against the rights we take away from innocent citizens. That is the key balance. Many bad things happen in households but we do not require people to live in houses with glass walls. That shows the intrinsic problem we have.

--- Later in debate ---
Matt Rodda

Thank you, Madam Deputy Speaker. A boy called Olly Stephens in my constituency was just 13 years old when he was stabbed and brutally murdered in an attack linked to online bullying. He died, sadly, very near his home. His parents had little idea of the social media activity in his life. It is impossible to imagine what they have been through. Our hearts go out to them.

Harmful but legal content played a terrible part in the attack on Olly. The two boys who attacked and stabbed him had been sharing enormous numbers of pictures and videos of knives, repeatedly, over a long period of time. There were often videos of teenagers playing with knives, waving them or holding them. They circulated them on 11 different social media platforms, and none of those platforms took any action to take the content down. We all need to learn more about such cases to fully understand the impact of legal but harmful content. Even at this late stage, I hope that the Government will think again about the changes they have made to the Bill and include this area again in the Bill.

There is a second aspect of this very difficult case that I want to mention: the fact that Olly’s murder was discussed on social media and was planned to some extent beforehand. The wider issues here underline the need for far greater regulation and moderation of social media, in particular teenagers’ use of these powerful sites. I am finding it difficult to talk about some of these matters, but I hope that the Government will take my points on board and address the issue of legal but harmful content, and that the Minister will think again about these important matters. Perhaps we will have an opportunity to discuss it in the Bill’s later stages.

Adam Afriyie

I am pleased to follow my fairly close neighbour from Berkshire, the hon. Member for Reading East (Matt Rodda). He raised the issue of legal but harmful content, which I will come to, as I address some of the amendments before us.

I very much welcome the new shape and focus of the Bill. Our primary duty in this place has to be to protect children, above almost all else. The refocusing of the Bill certainly does that, and it is now in a position where hon. Members from all political parties recognise that it is so close to fulfilling its function that we want it to get through this place as quickly as possible with today’s amendments and those that are forthcoming in the Lords and elsewhere in future weeks.

The emerging piece of legislation is better and more streamlined. I will come on to further points about legal but harmful, but I am pleased to see that removed from the Bill for adults and I will explain why, given the sensitive case that the hon. Member for Reading East mentioned. The information that he talked about being published online should be illegal, so it would be covered by the Bill. Illegal information should not be published and, within the framework of the Bill, would be taken down quickly. We in this place should not shirk our responsibilities; we should make illegal the things that we and our constituents believe to be deeply harmful. If we are not prepared to do that, we cannot say that some other third party has a responsibility to do it on our behalf and we are not going to have anything to do with it, and they can begin to make the rules, whether they are a commercial company or a regulator without those specific powers.

I welcome the shape of the Bill, but some great new clauses have been tabled. New clause 16 suggests that we should make it an offence to encourage self-harm, which is fantastic. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) has indicated that he will not press it to a vote, because the Government and all of us acknowledge that that needs to be dealt with at some point, so hopefully an amendment will be forthcoming in the near future.

On new clause 23, it is clear that if a commercial company is perpetrating an illegal act or is causing harm, it should pay for it, and a proportion of that payment must certainly support the payments to victims of that crime or breach of the regulations. New clauses 45 to 50 have been articulately discussed by my right hon. Friend the Member for Basingstoke (Dame Maria Miller). The technology around revenge pornography and deepfakes is moving forward every day. With some of the fakes online today, it is not possible to tell that they are fakes, even if they are looked at under a microscope. Those areas need to be dealt with, but it is welcome that she will not necessarily press the new clauses to a vote, because those matters must be picked up and defined in primary legislation as criminal acts. There will then be no lack of clarity and we will not need the legal but harmful concept—that will not need to exist. Something will either be illegal, because it is harmful, or not.

The Bill is great because it provides a framework that enables everything else that hon. Members in the House and people across the country may want to be enacted at a future date. It also enables the power to make those judgments to remain with this House—the democratically elected representatives of the people—rather than some grey bureaucratic body or commercial company whose primary interest is rightly to make vast sums of money for its shareholders. It is not for them to decide; it is for us to decide what is legal and what should be allowed to be viewed in public.

On amendment 152, which interacts with new clause 11, I was in the IT industry for about 15 to 20 years before coming to this place, albeit with a previous generation of technology. When it comes to end-to-end encryption, I am reminded of King Canute, who said, “I’m going to pass a law so that the tide doesn’t come in.” Frankly, we cannot pass a law that bans mathematics, which is effectively what we would be trying to do if we tried to ban encryption. The nefarious types or evildoers who want to hide their criminal activity will simply use mathematics to do that, whether in mainstream social media companies or through a nefarious route. We have to be careful about getting rid of all the benefits of secure end-to-end encryption for democracy, safety and protection from domestic abuse—all the good things that we want in society—on the basis of a tiny minority of very bad people who need to be caught. We should not be seeking to ban encryption; we should be seeking to catch those criminals, and there are ways of doing so.
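The point that encryption is "mathematics" rather than a service that can be switched off by statute can be made concrete with a minimal, hypothetical sketch. A one-time pad, which offers perfect secrecy when the key is truly random, as long as the message, and used once, needs nothing beyond the XOR operation and a source of random bytes, both available in any language's standard library. Python is used here purely for illustration; the function name `xor_pad` is an invention for this example.

```python
import secrets


def xor_pad(data: bytes, key: bytes) -> bytes:
    """XOR each byte of `data` with the matching byte of `key`.

    With a truly random key at least as long as the message and
    never reused, this is a one-time pad: without the key, the
    ciphertext reveals nothing about the plaintext.
    """
    if len(key) < len(data):
        raise ValueError("key must be at least as long as the message")
    return bytes(d ^ k for d, k in zip(data, key))


message = b"meet at dawn"
key = secrets.token_bytes(len(message))  # random key, used once

ciphertext = xor_pad(message, key)
recovered = xor_pad(ciphertext, key)  # XOR is its own inverse

assert recovered == message
```

A dozen lines of standard-library code are enough, which is the speaker's point: a legal ban on end-to-end encryption in mainstream platforms would not stop a determined bad actor from reproducing something like this, while it would remove the protection from everyone else.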

I welcome the Bill; I am pleased with the new approach and I think it can pass through this House swiftly if we stick together and make the amendments that we need. I have had conversations with the Minister about what I am asking for today: I am looking for an assurance that the Government will enable further debate and table the amendments that they have suggested. I also hope that they will be humble, as my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said, and open to some minor adjustments, even to the current thinking, to make the Bill pass smoothly through the Commons and the Lords.

I would like the Government to confirm that it is part of their vision that it will be this place, not a Minister of State, that decides every year—or perhaps every few months, because technology moves quickly—what new offences need to be identified in law. That will mean that Ofcom and the criminal justice system can get on to that quickly to ensure that the online world is a safer place for our children and a more pleasant place for all of us.

Several hon. Members rose—