Online Safety Bill Debate
Siobhan Baillie (Conservative - Stroud)
Commons Chamber

I thank the hon. Gentleman for his intervention. It is important that the Government have announced a strategy: it is part and parcel of the ongoing work that is so necessary when we consider the prevalence of suicide as the leading cause of death among young men and women. It is a scourge across society. People should not make the mistake of thinking that the internet merely showcases awful things. The internet has been used as a tool by exploitative and sometimes disturbed individuals to create more misery and more instances of awful things happening, and to lead others down a dangerous path that sometimes ends, sadly, in them taking their own lives.
I thank the Minister for his engagement with my constituents, and the shadow Minister for what she has done. I also thank Baroness Kidron, Baroness Morgan and hon. Members who have engaged with this issue. I urge the Government to see the Bill not as the end when it comes to tackling dangerous online content related to suicide and self-harm, but as part of an important ongoing journey that we all work on together.
I rise to speak to Lords amendment 231 on visible identity verification. I will not press the amendment to a vote. I have had several discussions with Ministers and the Secretary of State, and I am grateful for their time. I will explain a little more.
The dry nature of the amendment masks the fact that the issue of identity verification—or lack of it—affects millions of people around the country. We increasingly live our lives online, so the public being able to know who is or is not a real person online is a key part of the UK being the safest place to be on the internet, which is the Bill’s ambition. Unfortunately, too often it feels as though we have to wade through nutters, bots, fake accounts and other nasties before coming to a real person we want to hear from. The Bill takes huge steps to empower users to change that, but there is more to do.
Hon. Members will recall that I have campaigned for years to tackle anonymous abuse. I thank the Stroud constituents, celebrities and parents who have brought me sad stories, which I have conveyed to the House, involving abuse relating to the deaths of babies and children, and abuse directed at disabled children. That is absolutely awful.
Alongside a smart Stroud constituent and Clean Up The Internet—a fantastic organisation—we have fought and argued for social media users to have the option of being verified online; for them to be able to follow and be followed only by verified accounts, if that is what they want; and, crucially, to make it clear who is and is not verified online. People can still be Princess Unicorn if they want, but at the back end, their address and details can be held, and that will give confidence.
My hon. Friend is making a powerful case. Umberto Eco, the Italian philosopher, described the internet as the empire of imbeciles, and much of social media is indeed imbecilic—but it is much worse than that. My hon. Friend is right that the internet provides a hiding place for the kind of malevolence she has described. Does she agree that the critical thing is for the Government to look again at the responsibility of those who publish this material? If it were written material, the publisher would have a legal liability. That is not true of internet companies. Is that a way forward?
I am interested in that intervention, but I fear it would lead us into a very long discussion and I want to keep my comments focused on my amendment. However, it would be interesting to hear from the Minister in response to that point, because it is a huge topic for debate.
On the point about whether someone is real or not real online, I believe passionately that it should not be only famous people, or those who can afford it, who are able to show that they are a real and verified person. I say, “Roll out the blue ticks”—or the equivalents—and not just to make the social media platforms more money; as we have seen, we need verification as a safety mechanism and a personal responsibility mechanism.
All the evidence and endless polling show that the public want to know who is and who is not real online, and it does not take a rocket scientist to understand why. Dealing with faceless, anonymous accounts is very scary, and anonymous abusers are terrifying. Parents are worried that they do not know who their children are speaking to, and anonymous, unverified accounts cannot be traced if details are not held.
That is before we get to how visible verification can help to tackle fraud. We should empower people to avoid fake accounts. We know that people are less likely to engage with an unverified account, and visible verification would make it easier to catch scammers. Fraud was the most common form of crime in 2022: 41% of all crimes were fraud, 23% of all reported fraud was initiated on social media, and 80% of fraud was cyber-related. We can imagine just how fantastically clever the scams will become through AI.
Since we started this process, tech companies have recognised the value of identity verification to the public, so much so that they now sell it on Twitter as blue ticks, and the Government understand the benefits of identity verification options. The Government have done a huge amount of work on that. I thank them for agreeing to two of the three pillars of my campaign, and I believe we can get there on visibility; I know from discussions with Government that Ofcom will be looking carefully at that.
Making things simple for social media users is incredibly important. For the user verification provisions in this Bill to fulfil their potential and prevent harm, including illegal harm, we believe that users need to be able to see who is and is not verified—that is, who is a real person—and all the evidence says that that is what the public wants.
While Ministers in this place and the other place have resisted putting visible verification on the face of the Bill, I am grateful to the Government for their work on this. After a lot of to-ing and fro-ing, we are reassured that the Bill as now worded gives Ofcom the powers to do what the public wants and what we are suggesting through codes and guidance. We hope that Ofcom will consider the role of anonymous, inauthentic and non-verified accounts as it prepares its register of risks relating to illegal content and in its risk profiles.
I pay tribute to the way my hon. Friend has focused on this issue through so many months and years. Does she agree that, in light of the assurances that she has had from the Minister, this is just the sort of issue that either a stand-alone committee or some kind of scrutiny group could keep an eye on? If those guidelines do not work as the Minister is hoping, the action she has suggested will need to be taken.
Absolutely. Given the fast nature of social media and the tech world, and how quickly they adapt—often for their own benefit, sadly—I think that a committee with that focus could work.
To wrap up, I thank MPs from across the House, and you, Madam Deputy Speaker, for your grace today. I have had help from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) in particular, for which I am very grateful. In the other place, Lord Clement-Jones, Lord Stevenson, Baroness Morgan, Baroness Fall and Baroness Wyld have all been absolutely excellent in pushing through these matters. I look forward to hearing what the Minister says, and thank everybody for their time.
As others have done, I welcome the considerable progress made on the Bill in the other place, both in the detailed scrutiny that it has received from noble Lords, who have taken a consistent and expert interest in it, and in the positive and consensual tone adopted by Opposition Front Benchers and, crucially, by Ministers.
It seems that there are very few Members of this House who have not had ministerial responsibility for the Bill at some point in what has been an extraordinarily extensive relay race as it has moved through its legislative stages. The anchor leg—the hardest bit in such a Bill—has been run with dedication and skill by my right hon. Friend the Secretary of State, who deserves all the praise that she will get for holding the baton as we cross the parliamentary finish line, as I hope we are close to doing.
I have been an advocate of humility in the way in which we all approach this legislation. It is genuinely difficult and novel territory. In general, I think that my right hon. Friend the Secretary of State and her Ministers—the noble Lord Parkinson and, of course, the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Sutton and Cheam (Paul Scully)—have been willing to change their minds when it was right to do so, and the Bill is better for it. Like others who have dealt with them, I also thank the officials, some of whom sit in the Box, some of whom do not. They have dedicated—as I suspect they would see it—most of their lives to the generation of the Bill, and we are grateful to them for their commitment.
Of course, as others have said, none of this means that the Bill is perfect; frankly, it was never going to be. Nor does it mean that when we pass the Bill, the job is done. We will then pass the baton to Ofcom, which will have a large amount of further work to do. However, we now need to finalise the legislative phase of this work after many years of consideration. For that reason, I welcome in particular what I think are sensible compromises on two significant issues that had yet to be resolved: first, the content of children’s risk assessments, and secondly, the categorisation process. I hope that the House will bear with me while I consider those in detail, which we have not yet done, starting with Lords amendments 17, 20 and 22, and Lords amendment 81 in relation to search, as well as the Government amendments in lieu of them.
Those Lords amendments insert harmful “features, functionalities or behaviours” into the list of matters that should be considered in the children’s risk assessment process and in the meeting of the safety duties, to add to the harms arising from the intrinsic nature of content itself—that is an important change. As others have done, I pay great tribute to the noble Baroness Kidron, who has invariably been the driving force behind so many of the positive enhancements to children’s online safety that the Bill will bring. She has promoted this enhancement, too. As she said, it is right to recognise and reflect in the legislation that a child’s online experience can be harmful not just as a result of the harm an individual piece of content can cause, but in the way that content is selected and presented to that child—in other words, the way in which the service is designed to operate. As she knows, however, I part company with the Lords amendments in the breadth of the language used, particularly the word “behaviours”.
Throughout our consideration of the Bill, I have taken the view that we should be less interested in passing legislation that sounds good and more interested in passing legislation that works. We need the regulator to be able to encourage and enforce improvements in online safety effectively. That means asking the online platforms to address the harms that it is within their power to address, and that relate clearly to the design or operation of the systems that they have put in place.
The difficulty with the wording of the Lords amendments is that they bring into the ambit of the legislation behaviours that are not necessarily enabled or created by the design or operation of the service. The language used is
“features, functionalities or behaviours (including those enabled or created by the design or operation of the service) that are harmful to children”—
in other words, not limited to those that are enabled or created by the service. It is a step too far to make platforms accountable for all behaviours that are harmful to children without the clarity of that link to what the platform has itself done. For that reason, I cannot support those Lords amendments.
However, the Government have proposed a sensible alternative approach in their amendments in lieu, particularly in relation to Lords amendment 17 and Lords amendment 81, which relates to search services. The Government amendments in lieu capture the central point that the design of a service can lead to harm, and they require a service to assess that as part of the children’s risk assessment process. That is a significant expansion of a service’s responsibilities in the risk assessment process, which reflects not just ongoing concern about types of harm that were not adequately captured in the Bill so far, but also the positive moves we have all sought to make towards safety by design as an important preventive concept in online safety.
I also think it is important, given the potential scale of this expanded responsibility, to make clear that the concept of proportionality applies to a service’s approach to this element of assessment and mitigation of risk, as it does throughout the Bill, and I hope the Minister will be able to do that when he winds up the debate.