All 2 Sarah Owen contributions to the Online Safety Act 2023

Tue 13th Dec 2022
ONLINE SAFETY BILL (First sitting)
Public Bill Committees
Committee stage (re-committed clauses and schedules): 1st sitting

Tue 13th Dec 2022
ONLINE SAFETY BILL (Second sitting)
Public Bill Committees
Committee stage (re-committed clauses and schedules): 2nd sitting

ONLINE SAFETY BILL (First sitting)

Sarah Owen Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Kirsty Blackman (Aberdeen North) (SNP)

I beg to move amendment 98, in clause 11, page 10, line 17, at end insert

“, and—

(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”

This amendment requires services to take or use proportionate measures to mitigate the harm to children caused by habit-forming features of a service.

Thank you, Sir Roger, for chairing this recommitted Bill Committee. I will not say that it is nice to be back discussing the Bill again; we had all hoped to have made more progress by now. If you will indulge me for a second, I would like to thank the Clerks, who have been massively helpful in ensuring that this quick turnaround could happen and that we could table the amendments in a sensible form.

Amendment 98 arose from comments and evidence from the Royal College of Psychiatrists highlighting that a number of platforms, and particularly social media platforms such as TikTok and Facebook, generally encourage habit-forming behaviour or have algorithms that encourage it. Such companies are there to make money—that is what companies do—so they want people to linger on their sites and to spend as much time there as possible.

I do not know how many hon. Members have spent time on TikTok, but if they do, and they enjoy some of the cat videos, for instance, the algorithm will know and will show them more videos of cats. They will sit there and think, “Gosh, where did the last half-hour go? I have been watching any number of 20-second videos about cats, because they constantly come up.” Social media sites work by encouraging people to linger on the site and to spend the time dawdling and looking at the advertisements, which make the company additional revenue.

That is good for capitalism and for the company’s ability to make money, but the issue, particularly in relation to clause 11, is how that affects children. Children may not have the necessary filters; they may not have the ability that we have to put our phones down—not that we always manage to do so. That ability and decision-making process may not be as refined in children as in adults. Children can be sucked into the platforms by watching videos of cats or of something far more harmful.

Sarah Owen (Luton North) (Lab)

The hon. Member makes an excellent point about TikTok, but it also applies to YouTube. The platforms’ addictive nature has to do with the content. A platform does not just show a person a video of a cat, because that will not keep them hooked for half an hour. It has to show them a cat doing something extraordinary, and then a cat doing something even more extraordinary. That is why vulnerable people, especially children, get sucked into a dark hole. They click to see not just the same video but something more exciting, and then something even more exciting. That is the addictive nature of this.

Kirsty Blackman

That is absolutely the case. We are talking about cats because I chose them to illustrate the situation, but people may look at content about healthy eating, and that moves on to content that encourages them to be sick. The way the algorithms step it up is insidious; they get more and more extreme, so that the linger time is increased and people do not get bored. It is important that platforms look specifically at their habit-forming features.

--- Later in debate ---
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)

It is a pleasure to serve under your chairmanship, Sir Roger.

Being online can be a hugely positive experience for children and young people, but we recognise the challenge of habit-forming behaviour or designed addiction to some digital services. The Bill as drafted, however, would already deliver the intent of the amendment from the hon. Member for Aberdeen North. If service providers identify in their risk assessment that habit-forming or addictive-behaviour risks cause significant harm to an appreciable number of children on a service, the Bill will require them to put in place measures to mitigate and manage that risk under clause 11(2)(a).

To meet the child safety risk assessment duties under clause 10, services must assess the risk of harm to children from the different ways in which the service is used; the impact of such use; the level of risk of harm to children; how the design and operation of the service may increase the risks identified; and the functionalities that facilitate the presence or dissemination of content that is harmful to children. The definition of “functionality” at clause 200 already includes an expression of a view on content, such as applying a “like” or “dislike” button, as at subsection (2)(f)(i).

Sarah Owen

I thank the Minister for giving way so early on. He mentioned an “appreciable number”. Will he clarify what that is? Is it one, 10, 100 or 1,000?

Paul Scully

I do not think that a single number can be put on that, because it depends on the platform and the type of viewing. It is not easy to put a single number on that. An “appreciable number” is basically as identified by Ofcom, which will be the arbiter of all this. It comes back to what the hon. Member for Aberdeen North said about the direction that we, as she rightly said, want to give Ofcom. Ofcom has a range of powers already to help it assess whether companies are fulfilling their duties, including the power to require information about the operation of their algorithms. I would set the direction that the hon. Lady is looking for, to ensure that Ofcom uses those powers to the fullest and can look at the algorithms. We should bear in mind that social media platforms face criminal liability if they do not supply the information required by Ofcom to look under the bonnet.

--- Later in debate ---
Kirsty Blackman

That is exactly why users should be able to block private messaging in general. Someone on Twitter can say, “I’m not going to receive a direct message from anybody I don’t follow.” Twitter users have the opportunity to do that, but there is not necessarily that opportunity on all platforms. We are asking for those things to be included, so that the provider can say, “You’re using private messaging inappropriately. Therefore, we are blocking all your access to private messaging,” or, “You are being harmed as a result of accessing private messaging. Therefore, we are blocking your access to any private messaging. You can still see pictures on Instagram, but you can no longer receive any private messages, because we are blocking your access to that part of the site.” That is very different from blocking a user’s access to certain kinds of content, for example. I agree that that should happen, but it is about the functionalities and stopping access to some of them.

We are not asking Ofcom to mandate that platforms take this measure; they could still take the slightly more nuclear option of banning somebody entirely from their service. However, if this option is included, we could say, “Your service is doing pretty well, but we know there is an issue with private messaging. Could you please take action to ensure that those people who are using private messaging to harm children no longer have access to private messaging and are no longer able to use the part of the service that enables them to do these things?” Somebody might be doing a great job of making games in Roblox, but they may be saying inappropriate things. It may be proportionate to block that person entirely, but it may be more proportionate to block their access to voice chat, so that they can no longer say those things, or direct message or contact anybody. It is about proportionality and recognising that the service is not necessarily inherently harmful but that specific parts of it could be.

Sarah Owen

The hon. Member is making fantastic, salient points. The damage with private messaging is around phishing, as well as seeing a really harmful message and not being able to unsee it. Would she agree that it is about protecting the victim, not putting the onus on the victim to disengage from such conversations?

Kirsty Blackman

I completely agree. The hon. Member put that much better than I could. I was trying to formulate that point in my head, but had not quite got there, so I appreciate her intervention. She is right: we should not put the onus on a victim to deal with a situation. Once they have seen a message from someone, they can absolutely block that person, but that person could create another account and send them messages again. People should be able to choose, and to say, “No, I don’t want anyone to be able to send me private messages,” or “I don’t want any private messages from anyone I don’t know.” We could put in those safeguards.

I am talking about adding another layer to the clause, so that companies would not necessarily have to demonstrate that it was proportionate to ban a person from using their service, as that may be too high a bar—a concern I will come to later. They could, however, demonstrate that it was proportionate to ban a person from using private messaging services, or from accessing livestreaming features. There has been a massive increase in self-generated child sexual abuse images, and a huge amount has come from livestreaming. There are massive risks with livestreaming features on services.

Livestreaming is not always bad. Someone could livestream themselves showing how to make pancakes. There is no issue with that—that is grand—but livestreaming is being used by bad actors to manipulate children into sharing videos of themselves, and once they are on the internet, they are there forever. It cannot be undone. If we were able to ban vulnerable users—my preferred option would be all children—from accessing livestreaming services, they would be much safer.

ONLINE SAFETY BILL (Second sitting)

Sarah Owen Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Charlotte Nichols (Warrington North) (Lab)

It is a pleasure to serve under your chairship, Dame Angela. I did not make a note of the specific word I was on when we adjourned, so I hope Hansard colleagues will forgive me if the flow between what I said previously and what I say now is somewhat stilted.

I will keep this brief, because I was—purposefully—testing the patience of the Minister with some of my contributions. However, I did so to hammer home the fact that the removal of clauses 12 and 13 from the Bill is a fatal error. If the recommittal of the Bill is not to fundamentally undermine what the Bill set out to do five years or so ago, their removal should urgently be reconsidered. We have spent five years debating the Bill to get it to this point.

As I said, there are forms of harm that are not illegal, but they are none the less harmful, and they should be legislated for. They should be in the Bill, as should specific protections for adults, not just children. I therefore urge the Minister to keep clauses 12 and 13 in the Bill so that we do not undermine what it set out to do and all the work that has been done up to this point. Inexplicably, the Government are trying to undo that work at this late stage before the Bill becomes law.

Sarah Owen (Luton North) (Lab)

It is a pleasure to see you in the Chair, Dame Angela—I wish it was a toastier room. Let me add to the points that the shadow Minister, my hon. Friend the Member for Pontypridd, made so powerfully about vulnerable people. There is no cliff edge when such a person becomes 18. What thought have the Minister and the Department given to vulnerable young adults with learning disabilities or spectrum disorders? Frankly, the idea that, as soon as a person turns 18, they are magically no longer vulnerable is for the birds—particularly when it comes to eating disorders, suicide and self-harm.

Adults do not live in isolation, and they do not just live online. We have a duty of care to people. The perfect example is disinformation, particularly when it comes to its harmful impact on public health. We saw that with the pandemic and vaccine misinformation. We saw it with the harm done to children by the anti-vaccine movement’s myths about vaccines, children and babies. It causes greater harm than just having a conversation online.

People do not stay in one lane. Once people start being sucked into conspiracy myths, much as we discussed earlier around the algorithms that are used to keep people online, it has to keep ramping up. Social media and tech companies do that very well. They know how to do it. That is why I might start looking for something to do with ramen recipes and all of a sudden I am on to a cat that has decided to make noodles. It always ramps up. That is the fun end of it, but on the serious end somebody will start to have doubts about certain public health messages the Government are sending out. That then tips into other conspiracy theories that have really harmful, damaging consequences.

I saw that personally. My hon. Friend the Member for Warrington North eloquently put forward some really powerful examples of what she has been subjected to. With covid, some of the anti-vaccinators and anti-mask-wearers who targeted me quickly slipped into Sinophobia and racism. I was sent videos of people eating live animals, and being blamed for a global pandemic.

The people who have been targeted do not stay in one lane. The idea that adults are not vulnerable, and susceptible, to such targeting and do not need protection from it is frankly for the birds. We see that particularly with extremism, misogyny and the incel culture. I take the point from our earlier discussion about who determines what crosses the legal threshold, but why do we have to wait until somebody is physically hurt before the Government act?

That is really regrettable. So, too, is the fact that this is such a huge U-turn in policy, with 15% of the Bill coming back to Committee. As we have heard, that is unprecedented, and yet, on the most pivotal point, we were unable to hear expert advice, particularly from the National Society for the Prevention of Cruelty to Children, Barnardo’s and the Antisemitism Policy Trust. I was struggling to understand why we would not hear expert advice on such a drastic change to an important piece of legislation—until I heard the hon. Member for Don Valley talk about offence. This is not about offence; it is about harm.

The hon. Member’s comments highlighted perfectly the real reason we are all here in a freezing cold Bill Committee, rehashing work that has already been solved. The Bill was not perfect by any stretch of the imagination, but it was better than what we have today. The real reason we are here is the fight within the Conservative party.

Nick Fletcher (Don Valley) (Con)

No such fight has taken place. These are my personal views, and I genuinely believe that people have a right to say what they would like to say. That is free speech. There have been no fights whatever.

Sarah Owen

In that case, I must have been mistaken in thinking that the hon. Member—who has probably said quite a lot of things, which is why his voice is as hoarse as it is—was criticising the former Minister for measures that were agreed in previous Committee sittings.

For me, the current proposals are a really disappointing, retrograde step. They will not protect the most vulnerable people in our communities, including offline—this harm is not just online, but stretches out across all our communities. What happens online does not take place, and stay, in an isolated space; people are influenced by it and take their cues from it. They do not just take their cues from what is said in Parliament; they see misogynists online and think that they can treat people like that. They see horrific abuses of power and extreme pornography and, as we heard from the hon. Member for Aberdeen North, take their cues from that. What happens online does not stay online.

Alex Davies-Jones (Pontypridd) (Lab)

My hon. Friend makes an important point about what happens online and its influence on the outside world. We saw that most recently with Kanye West being reinstated to Twitter and allowed to spew his bile and abhorrent views about Jews. That antisemitism had a real-world impact in terms of the rise in antisemitism on the streets, particularly in the US. The direct impact of his being allowed to talk about that online was Jews being harmed in the real world. That is exactly what is happening.

Sarah Owen

I thank the shadow Minister for that intervention. She is absolutely right. We have had a discussion about terms of reference and terms of service. Not only do most people not actually fully read them or understand them, but they are subject to change. The moment Elon Musk took over Twitter, everything changed. Not only have we got Donald Trump back, but Elon Musk also gave the keys to a mainstream social media platform to Kanye West. We have seen what happened then.

That is the situation the Government will now not shut the door on. That is regrettable. For all the reasons we have heard today, it is really damaging. It is really disappointing that we are not taking the opportunity to lead in this area.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)

It is a pleasure to serve under your chairmanship, Dame Angela.

A lot of the discussion has replayed the debate from day two on Report about the removal of “legal but harmful” measures. Some of the discussion this morning and this afternoon has covered really important issues such as self-harm, on which, as we said on the Floor of the House, we will introduce measures at a later stage. I will not talk about those measures now, but I would just say that we have already said that if we agree that the promotion of things such as self-harm is illegal, it should be illegal. Let us be very straight about how we deal with the promotion of self-harm.

The Bill will bring huge improvements for adult safety online. In addition to their duty to tackle illegal content, companies will have to provide adult users with tools to keep themselves safer. On some of the other clauses, we will talk about the triple shield that was mentioned earlier. If the content is illegal, it will still be illegal. If content does not adhere to the companies’ terms of service—that includes many of the issues that we have been debating for the last hour—it will have to be removed. We will come to user enforcement issues in further clauses.

--- Later in debate ---
Paul Scully

No, not about whether climate change is happening, but we are talking about a wide range. “Provides false information”—how do the companies determine what is false? I am not talking about the binary question of whether climate change is happening, but climate change is a wide-ranging debate. “Provides false information” means that someone has to determine what is false and what is not. Basically, the amendment outsources that to the social media platforms. That is not appropriate.

Sarah Owen

Would that not also apply to vaccine efficacy? If we are talking about everything being up for debate and nothing being a hard fact, we are entering slightly strange worlds where we undo a huge amount of progress, in particular on health.

Paul Scully

The amendment does not talk about vaccine efficacy; it talks about content that is harmful to health. That is a wide-ranging thing.

--- Later in debate ---
Alex Davies-Jones

As we know, the clause makes provision in relation to the making of regulations designating primary and priority content that is harmful to children, and priority content that is harmful to adults. The Secretary of State may specify a description of content in regulations only if they consider that there is a material risk of significant harm to an appreciable number of children or adults in the United Kingdom presented by user-generated or search content of that description, and must consult Ofcom before making such regulations.

In the last Bill Committee, Labour raised concerns that there were no duties that required the Secretary of State to consult others, including expert stakeholders, ahead of making these regulations. That decision cannot be for one person alone. When it comes to managing harmful content, unlike illegal content, we can all agree that it is about implementing systems that prevent people from encountering it, rather than removing it entirely.

Sarah Owen

The fact that we are here again to discuss what one Secretary of State wanted to put into law, and which another is now seeking to remove before the law has even been introduced, suggests that my hon. Friend’s point about protection and making sure that there are adequate measures within which the Secretary of State must operate is absolutely valid.

Alex Davies-Jones

I completely agree: we are now on our third Secretary of State, our third Minister and our third Prime Minister since we began considering this iteration of the Bill. It is vital that this does not come down to one person’s ideological beliefs. We have spoken at length about this issue; the hon. Member for Don Valley has spoken about his concerns that Parliament should be sovereign, and should make these decisions. It should not be for one individual or one stakeholder to make these determinations.

We also have issues with the Government’s chosen toggle approach—we see that as problematic. We have debated it at length, but our concerns regarding clause 56 are about the lack of consultation that the Secretary of State of the day, whoever that may be and whatever political party they belong to, will be forced to make before making widespread changes to a regime. I am afraid that those concerns still exist, and are not just held by us, but by stakeholders and by Members of all political persuasions across the House. However, since our proposed amendment was voted down in the previous Bill Committee, nothing has changed. I will spare colleagues from once again hearing my pleas about the importance of consultation when it comes to determining all things related to online safety, but while Labour Members do not formally oppose the clause, we hope that the Minister will address our widespread concerns about the powers of the Secretary of State in his remarks.

--- Later in debate ---
Paul Scully

The different platforms, approaches and conditions will necessitate different numbers; it would be hard to pin a number down. The wording is vague and wide-ranging because it is trying to capture any number of scenarios, many as yet unknown. However, the regulations designating priority harms will be made under the draft affirmative resolution procedure.

Sarah Owen

On that point, which we discussed earlier—my hon. Friend the Member for Warrington North discussed it—I am struggling to understand what is an acceptable level of harm, and what is the acceptable number of people to be harmed, before a platform has to act.

Paul Scully

It totally depends on the scenario. It is very difficult for me to stand here now and give a wide number of examples, but the Secretary of State will be reacting to a given situation, rather than trying to predict them.

--- Later in debate ---
Alex Davies-Jones

My hon. Friend is absolutely right. We want the Bill to work. We have always wanted the Bill to work. We want it to achieve its aim of keeping children, adults and everyone who uses the internet safe from the harms that are perpetuated there. If there is no transparency, how will we know that the platforms are breaking the rules covertly, and whether they are hiding content and getting round the rules? That is what they do; we know it, because we have heard it from whistleblowers, but we cannot rely on whistleblowers alone to highlight exactly what happens behind the closed doors of the platforms.

We need the transparency and the reports to be made public, so that we can see whether the legislation is working. If that does not happen, although we have waited five years, we will need another piece of legislation to fix it. We know that the Bill is not perfect, and the Minister knows that—he has said so himself—but, ultimately, we need to know that it works. If it does not, we have a responsibility as legislators to put something in place that does. Transparency is the only way in which we will figure that out.

Sarah Owen

I want to add to the brilliant points made by my hon. Friend the shadow Minister, in particular on the continually changing nature of market forces, which the Minister himself referenced. We want innovation. We want the tech companies to innovate—preferably ones in the UK—but we do not want to be playing catch-up as we are now, making legislation retrospectively to right wrongs that have taken place because our legislative process has been too slow to deal with the technological changes and the changes in social media, in apps, and with how we access data and communicate with one another online. The bare minimum is a biannual report.

Within six months, if a new piece of technology comes up, it does not simply stay with one app or platform; that technology will be leapfrogged by others. Such technological advances can take place at a very rapid pace. The transparency aspect is important, because people should have a right to know what they are using and whether it is safe. We as policy makers should have a right to know clearly whether the legislation that we have introduced, or the legislation that we want to amend or update, is effective.

If we look at any other approach that we take to protect the health and safety of the people in our country—the people we all represent in our constituencies —we always say that prevention is better than cure. At the moment, without transparency and without researchers being able to update the information we need to see, we will constantly be playing catch-up with digital tech.

Kirsty Blackman

This may be the only place in the Bill where I do not necessarily agree wholeheartedly with the Labour Front Benchers. I agree with the vast majority of what was said, but I have some concerns about making mandatory the requirement for transparency reports to be public in all circumstances, because there are circumstances in which that would simply highlight loopholes, allowing people to exploit them in a way that we do not want them to do.

Specifically on the regularity of reporting and some level of transparency, given that the Minister is keen on the commercial imperative and ensuring that people are safe, we need a higher level of transparency than we currently see among the platforms. There is a very good case to be made for some of the transparency reporting to be made public, particularly for the very largest platforms to be required to make it public, or to make sections of it public.

I want to talk about the speed of change to the terms of service and about proportionality. If Ofcom could request transparency reporting only annually, imagine that it received transparency information three days before Elon Musk took over Twitter. Twitter would be a completely different place three days later, and Ofcom would be unable to ask for more transparency information for a whole year, by which point a significant amount of damage could have been done. We have seen that the terms of service can change quickly. Ofcom would not have the flexibility to ask for an updated transparency report, even if drastic changes were made to the services.

Another thing slightly concerns me about doing this annually and not allowing a bit more flexibility. Let us say that a small platform that none of us has ever heard of, such as Mastodon, shoots to prominence overnight. Let us also say that, as a small platform, Mastodon was previously regulated, and Ofcom had made a request for transparency information shortly before Elon Musk took over Twitter and people had migrated to Mastodon. Mastodon would now be suffering from very different issues than those it had when it had a small number of users, compared with the significant number that it has now. It would have changed dramatically, yet Ofcom would not have the flexibility to seek that information. We know that platforms in the online world have sudden stellar increases in popularity overnight. Some have been bubbling along for ages with nobody using them. Not all of them are brand-new platforms that suddenly shoot to prominence. The lack of flexibility is a problem.

Lastly, I agree about researchers being able to access the transparency information provided. It is really important that we recognise that Ofcom is not the only expert. Ofcom has a huge amount of expertise, and it is massively increasing its staff numbers to cope with these issues, but the reality is that those staff are not academic researchers. They are unable to look at the issues and are not necessarily the most prominent experts in the field of child protection, for example. That is not to take away from the expertise in Ofcom, but we could allow it to ask a regulated group of researchers to look at the information and point out any issues that may not have been spotted, particularly given the volume of transparency reports that there are likely to be.