All 2 Siobhan Baillie contributions to the Online Safety Act 2023

Tue 17th Jan 2023
Tue 12th Sep 2023
Online Safety Bill
Commons Chamber

Consideration of Lords amendments

Siobhan Baillie Excerpts
Siobhan Baillie (Stroud) (Con)

Thank you, Mr Deputy Speaker.

I rise to speak to amendments 52 and 53. As you know, Mr Deputy Speaker, I have been campaigning to tackle anonymous abuse for many years now. I have been working with the fantastic Clean Up The Internet organisation, Stroud residents and the brilliant Department for Digital, Culture, Media and Sport team. We have been focused on practical measures that will empower social media users to protect themselves from anonymous abuse. I am pleased to say that the Government accepted our campaign proposals to introduce verification options. They give people the option to follow and be followed only by verified accounts, if that is what they choose, and to ensure that they know who is and who is not verified. That will also assist in ensuring that the positive parts of anonymity can continue online, as there are many. I respectfully think that that work is even more important now that we have seen the removal of the “legal but harmful” clauses, because we know what will be viewed by children and vulnerable adults who want to be protected online.

We are not resting on that campaign win, however. We want to see the verification measures really work in the real world and for social media companies to adopt them quickly without any confusion about their duties. Separately, clarity is the order of the day, because the regulator Ofcom is going to have an awful lot to do thanks to the excellent clauses throughout the legislation.

This issue is urgent. We must not forget that anonymous social media accounts are spewing out hateful bile every single minute of the day. Children and vulnerable adults are left terrified: it is much more scary for them to receive comments about suicide, self-harm and bullying, or messages from anorexia pushers, when they do not know who the senders are.

Financial scammers tend to hide behind anonymity. Faceless bots cause mayhem and start nasty pile-ons. Perverts know that when they send a cyber-flashing dick pic to an unsuspecting woman, it is very unlikely, if it comes from an anonymous account, that it will be traced back to them. It is really powerful and important for people to have the tools to not see unverified nonsense or abuse, to be able to switch that off and to know that the people they follow are real.

I am keen for the Minister and the Government to adopt amendments 52 and 53. They are by no means the most sexy and jazzy amendments before the House; they are more tweaks than amendments. They would change the wording to bring the legislation up to date in the light of recent changes. They would also ensure that it is obvious if people are verified—blue ticks are a really good example of that—which was part of my campaign in the first place. I understand from discussions that the Government are considering adopting my amendments. I thank colleagues for calling them sensible and backing them. They are really important.

Finally, I have made the case many times that the public expect us to act and to be strong in this policy area, but they also expect things to happen very quickly. We have waited a very long time. It is incredibly important to give people the power and tools to protect themselves, whether by sliding a button or switching something off. My great hope from the campaigning that I have done is that young people and adults will follow unverified accounts only through an active choice.

Kirsty Blackman

On that specific point, does the hon. Lady realise that the empowerment duties in respect of verified and non-verified users apply only to adult users? Children will not have the option to toggle off unverified users, because the user empowerment duties do not allow that to happen.

Siobhan Baillie

The evidence we have received is that it is parents who need the powers. I want to normalise the ability to turn off anonymised accounts. I think we will see children do that very naturally. We should also try to persuade their parents to take those stances and to have those conversations in the home. I obviously need to take up the matter with the hon. Lady and think carefully about it as matters proceed through the other place.

We know that parents are very scared about what their children see online. I welcome what the Minister is trying to do with the Bill and I welcome the legislation and the openness to change it. These days, we are all called rebels whenever we do anything to improve legislation, but the reality is that that is our job. We are sending this legislation to the other House in a better shape.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)

There is a lot to cover in the short time I have, but first let me thank Members for their contributions to the debate. We had great contributions from the hon. Member for Pontypridd (Alex Davies-Jones), my right hon. Friend the Member for Witham (Priti Patel) and the right hon. Member for Barking (Dame Margaret Hodge)—I have to put that right, having not mentioned her last time—as well as from my hon. Friend the Member for Gosport (Dame Caroline Dinenage); the hon. Member for Aberdeen North (Kirsty Blackman); the former Secretary of State, my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright); and the hon. Members for Plymouth, Sutton and Devonport (Luke Pollard), for Reading East (Matt Rodda) and for Leeds East (Richard Burgon).

I would happily meet the hon. Member for Plymouth, Sutton and Devonport to talk about incel content, as he requested, and the hon. Members for Reading East and for Leeds East to talk about Olly Stephens and Joe Nihill. Those are two really tragic examples and it was good to hear the tributes to them and their being mentioned in this place in respect of the changes in the legislation.

We had great contributions from my right hon. Friend the Member for South Northamptonshire (Dame Andrea Leadsom), the hon. Member for Strangford (Jim Shannon) and my hon. Friend the Member for Dover (Mrs Elphicke). I am glad that my hon. Friend the Member for Stone (Sir William Cash) gave a three-Weetabix speech—I will have to look in the Tea Room for the Weetabix he has been eating.

There were great contributions from my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Great Grimsby (Lia Nici), from my right hon. Friend the Member for Chelmsford (Vicky Ford) and from my hon. Friend the Member for Yeovil (Mr Fysh). The latter talked about doom-scrolling; I recommend that he speaks to my right hon. Friend the Member for South Holland and The Deepings (Sir John Hayes), whose quoting of G. K. Chesterton shows the advantages of reading books rather than scrolling through a phone. I also thank my hon. Friends the Members for Redditch (Rachel Maclean), for Watford (Dean Russell) and for Stroud (Siobhan Baillie).

I am also grateful for the contributions during the recommittal process. The changes made to the Bill during that process have strengthened the protections that it can offer.

We reviewed new clause 2 carefully, and I am sympathetic to its aims. We have demonstrated our commitment to strengthening protections for children elsewhere in the Bill by tabling a series of amendments at previous stages, and the Bill already includes provisions to make senior managers liable for failing to prevent a provider from committing an offence and for failing to comply with information notices. We are committed to ensuring that children are safe online, so we will work with those Members and others to bring to the other place an effective amendment that delivers our shared aims of holding people accountable for their actions in a way that is effective and targeted at child safety, while ensuring that the UK remains an attractive place for technology companies to invest and grow.

We need to take time to get this right. We intend to base our amendments on the Irish Online Safety and Media Regulation Act 2022, which, ironically, was largely based on our work here, and which introduces individual criminal liability for failure to comply with the notice to end contravention. In line with that approach, the final Government amendment, at the end of the ping-pong between the other place and this place, will be carefully designed to capture instances in which senior managers, or those purporting to act in that capacity, have consented or connived in ignoring enforceable requirements, risking serious harm to children. The criminal penalties, including imprisonment or fines, will be commensurate with those applying to similar offences. While the amendment will not affect those who have acted in good faith to comply in a proportionate way, it will give the Act additional teeth—as we have heard—to deliver the change that we all want, and ensure that people are held to account if they fail to protect children properly.

As was made clear by my right hon. Friend the Member for Witham, child protection and strong implementation are at the heart of the Bill. Its strongest protections are for children, and companies will be held accountable for their safety. I cannot guarantee the timings for which my right hon. Friend asked, but we will not dilute our commitment. We have already started to speak to companies in this sphere, and I will also continue to work with her and others.

Online Safety Bill

Siobhan Baillie Excerpts
Richard Burgon

I thank the hon. Gentleman for his intervention. It is important that the Government have announced a strategy: it is part and parcel of the ongoing work that is so necessary when we consider the prevalence of suicide as the leading cause of death among young men and women. It is a scourge across society. People should not make the mistake of thinking that the internet merely showcases awful things. The internet has been used as a tool by exploitative and sometimes disturbed individuals to create more misery and more instances of awful things happening, and to lead others down a dangerous path that sometimes ends, sadly, in them taking their own lives.

I thank the Minister for his engagement with my constituents, and the shadow Minister for what she has done. I also thank Baroness Kidron, Baroness Morgan and hon. Members who have engaged with this issue. I urge the Government to see the Bill not as the end when it comes to tackling dangerous online content related to suicide and self-harm, but as part of an important ongoing journey that we all work on together.

Siobhan Baillie (Stroud) (Con)

I rise to speak to Lords amendment 231 on visible identity verification. I will not press the amendment to a vote. I have had several discussions with Ministers and the Secretary of State, and I am grateful for their time. I will explain a little more.

The dry nature of the amendment masks the fact that the issue of identity verification—or the lack of it—affects millions of people around the country. We increasingly live our lives online, so the public being able to know who is or is not a real person online is a key part of the UK being the safest place to be on the internet, which is the Bill’s ambition. Unfortunately, too often it feels as though we have to wade through nutters, bots, fake accounts and other nasties before coming to a real person we want to hear from. The Bill takes huge steps to empower users to change that, but there is more to do.

Hon. Members will recall that I have campaigned for years to tackle anonymous abuse. I thank Stroud constituents, celebrities and parents who have brought to me sad stories that I have conveyed to the House involving abuse about the deaths of babies and children and about disabled children. That is absolutely awful.

Alongside a smart Stroud constituent and Clean Up The Internet—a fantastic organisation—we have fought and argued for social media users to have the option of being verified online; for them to be able to follow and be followed only by verified accounts, if that is what they want; and, crucially, to make it clear who is and is not verified online. People can still be Princess Unicorn if they want, but at the back end, their address and details can be held, and that will give confidence.

Sir John Hayes

My hon. Friend is making a powerful case. Umberto Eco, the Italian philosopher, described the internet as the empire of imbeciles, and much of social media is indeed imbecilic—but it is much worse than that. My hon. Friend is right that the internet provides a hiding place for the kind of malevolence she has described. Does she agree that the critical thing is for the Government to look again at the responsibility of those who publish this material? If it were written material, the publisher would have a legal liability. That is not true of internet companies. Is that a way forward?

Siobhan Baillie

I am interested in that intervention, but I fear it would lead us into a very long discussion and I want to keep my comments focused on my amendment. However, it would be interesting to hear from the Minister in response to that point, because it is a huge topic for debate.

On the point about whether someone is real or not real online, I believe passionately that it should not be only famous people, or those who can afford it, who are able to show that they are a real and verified person. I say, “Roll out the blue ticks”—or their equivalents—and not just to make the social media platforms more money; as we have seen, we need it as a safety mechanism and a personal responsibility mechanism.

All the evidence and endless polling show that the public want to know who is and who is not real online, and it does not take rocket science to understand why. Dealing with faceless, anonymous accounts is very scary and anonymous abusers are terrifying. Parents are worried that they do not know who their children are speaking to, and anonymous, unverified accounts cannot be traced if details are not held.

That is before we get to how visible verification can help to tackle fraud. We should empower people to avoid fake accounts. We know that people are less likely to engage with an unverified account, and it would make it easy to catch scammers. Fraud was the most common form of crime in 2022, with 41% of all crimes being fraud, 23% of all reported fraud being initiated on social media and 80% of fraud being cyber-related. We can imagine just how fantastically clever the scams will become through AI.

Since we started this process, tech companies have recognised the value of identity verification to the public, so much so that they now sell it on Twitter as blue ticks, and the Government understand the benefits of identity verification options. The Government have done a huge amount of work on that. I thank them for agreeing to two of the three pillars of my campaign, and I believe we can get there on visibility; I know from discussions with Government that Ofcom will be looking carefully at that.

Making things simple for social media users is incredibly important. For the user verification provisions in this Bill to fulfil their potential and prevent harm, including illegal harm, we believe that users need to be able to see who is and is not verified—that is, who is a real person—and all the evidence says that that is what the public wants.

While Ministers in this place and the other place have resisted putting visible verification on the face of the Bill, I am grateful to the Government for their work on this. After a lot of to-ing and fro-ing, we are reassured that the Bill as now worded gives Ofcom the powers to do what the public wants and what we are suggesting through codes and guidance. We hope that Ofcom will consider the role of anonymous, inauthentic and non-verified accounts as it prepares its register of risks relating to illegal content and in its risk profiles.

Dame Maria Miller

I pay tribute to the way my hon. Friend has focused on this issue through so many months and years. Does she agree that, in light of the assurances that she has had from the Minister, this is just the sort of issue that either a stand-alone committee or some kind of scrutiny group could keep an eye on? If those guidelines do not work as the Minister is hoping, the action she has suggested will need to be taken.

Siobhan Baillie

Absolutely. Given the fast nature of social media and the tech world, and how quickly they adapt—often for their own benefit, sadly—I think that a committee with that focus could work.

To wrap up, I thank MPs from across the House, and you, Madam Deputy Speaker, for your grace today. I have had help from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) in particular, for which I am very grateful. In the other place, Lord Clement-Jones, Lord Stevenson, Baroness Morgan, Baroness Fall and Baroness Wyld have all been absolutely excellent in pushing through these matters. I look forward to hearing what the Minister says, and thank everybody for their time.

Sir Jeremy Wright (Kenilworth and Southam) (Con)

As others have done, I welcome the considerable progress made on the Bill in the other place, both in the detailed scrutiny that it has received from noble Lords, who have taken a consistent and expert interest in it, and in the positive and consensual tone adopted by Opposition Front Benchers and, crucially, by Ministers.

It seems that there are very few Members of this House who have not had ministerial responsibility for the Bill at some point in what has been an extraordinarily extensive relay race as it has moved through its legislative stages. The anchor leg—the hardest bit in such a Bill—has been run with dedication and skill by my right hon. Friend the Secretary of State, who deserves all the praise that she will get for holding the baton as we cross the parliamentary finish line, as I hope we are close to doing.

I have been an advocate of humility in the way in which we all approach this legislation. It is genuinely difficult and novel territory. In general, I think that my right hon. Friend the Secretary of State and her Ministers—the noble Lord Parkinson and, of course, the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Sutton and Cheam (Paul Scully)—have been willing to change their minds when it was right to do so, and the Bill is better for it. Like others who have dealt with them, I also thank the officials, some of whom sit in the Box, some of whom do not. They have dedicated—as I suspect they would see it—most of their lives to the generation of the Bill, and we are grateful to them for their commitment.

Of course, as others have said, none of this means that the Bill is perfect; frankly, it was never going to be. Nor does it mean that when we pass the Bill, the job is done. We will then pass the baton to Ofcom, which will have a large amount of further work to do. However, we now need to finalise the legislative phase of this work after many years of consideration. For that reason, I welcome in particular what I think are sensible compromises on two significant issues that had yet to be resolved: first, the content of children’s risk assessments, and secondly, the categorisation process. I hope that the House will bear with me while I consider those in detail, which we have not yet done, starting with Lords amendments 17, 20 and 22, and Lords amendment 81 in relation to search, as well as the Government amendments in lieu of them.

Those Lords amendments insert harmful “features, functionalities or behaviours” into the list of matters that should be considered in the children’s risk assessment process and in the meeting of the safety duties, to add to the harms arising from the intrinsic nature of content itself—that is an important change. As others have done, I pay great tribute to the noble Baroness Kidron, who has invariably been the driving force behind so many of the positive enhancements to children’s online safety that the Bill will bring. She has promoted this enhancement, too. As she said, it is right to recognise and reflect in the legislation that a child’s online experience can be harmful not just as a result of the harm an individual piece of content can cause, but in the way that content is selected and presented to that child—in other words, the way in which the service is designed to operate. As she knows, however, I part company with the Lords amendments in the breadth of the language used, particularly the word “behaviours”.

Throughout our consideration of the Bill, I have taken the view that we should be less interested in passing legislation that sounds good and more interested in passing legislation that works. We need the regulator to be able to encourage and enforce improvements in online safety effectively. That means asking the online platforms to address the harms that it is within their power to address, and to relate those harms clearly to the design or operation of the systems that they have put in place.

The difficulty with the wording of the Lords amendments is that they bring into the ambit of the legislation behaviours that are not necessarily enabled or created by the design or operation of the service. The language used is

“features, functionalities or behaviours (including those enabled or created by the design or operation of the service) that are harmful to children”—

in other words, not limited to those that are enabled or created by the service. It is a step too far to make platforms accountable for all behaviours that are harmful to children without the clarity of that link to what the platform has itself done. For that reason, I cannot support those Lords amendments.

However, the Government have proposed a sensible alternative approach in their amendments in lieu, particularly in relation to Lords amendment 17 and Lords amendment 81, which relates to search services. The Government amendments in lieu capture the central point that the design of a service can lead to harm, and they require a service to assess that as part of the children’s risk assessment process. That is a significant expansion of a service’s responsibilities in the risk assessment process, which reflects not just ongoing concern about types of harm that were not adequately captured in the Bill so far, but the positive moves we have all sought to make towards safety by design as an important preventive concept in online safety.

I also think it is important, given the potential scale of this expanded responsibility, to make clear that the concept of proportionality applies to a service’s approach to this element of assessment and mitigation of risk, as it does throughout the Bill, and I hope the Minister will be able to do that when he winds up the debate.