Online Safety Bill Debate
Kirsty Blackman (Scottish National Party, Aberdeen North)
Debate with the Department for Science, Innovation and Technology
(1 year, 2 months ago)
Commons Chamber

The right hon. Lady raises some interesting points. We have conversed about harms, so I totally get her point about making sure that we tackle this issue in Parliament and are accountable in Parliament. As I have said, that will be done predominantly by monitoring the Bill through Ofcom’s reporting on what harms it is having to deal with. We have regular engagement with Ofcom, not only here and through the Select Committees, but through the Secretary of State.
On criminal liability, we made sure that liability attached to something specific, rather than taking the general approach proposed at the beginning. That means we are not chilling innovation. People can understand, as they set up their approaches and systems, exactly what risk of criminal liability they are taking on.
The review mechanism strikes me as one of the places where the Bill is weakest, because there is no dedicated review mechanism. We have needed this legislation for more than 30 years, and we have only now got to the point of legislating. Does the Minister understand why I have no faith that future legislation will happen in a timely fashion, when it has taken us so long even to get to this point? Can he give us some reassurance that a proper review will take place, rather than just having Ofcom reports that may or may not be read?
I have talked about the fact that we have to keep this legislation under review, because the landscape is fast-moving. At every stage that I have been dealing with this Bill, I have said that inevitably we will have to come back. We can make the Bill as flexible, proportionate and tech-unspecific as we can, but things are moving quickly. With all our work on AI, for example, such as the AI summit, the work of the Global Partnership on Artificial Intelligence, the international response, the Hiroshima accord and all the other areas that my hon. Friend the Member for Weston-super-Mare (John Penrose) spoke about earlier, we will have to come back, review it and look at whether the legislation remains world-beating. It is not just about the findings of Ofcom as it reports back to us.
I need to make a bit of progress, because I hope to have time to sum up a little bit at the end. We have listened to concerns about ensuring that the Bill provides the most robust protections for children from pornography and on the use of age assurance mechanisms. We are now explicitly requiring relevant providers to use highly effective age verification or age estimation to protect children from pornography and other primary priority content that is harmful to children. The Bill will also ensure a clear privacy-preserving and future-proofed framework governing the use of age assurance, which will be overseen by Ofcom.
There has been coverage in the media about how the Bill relates to encryption, which has often not been accurate. I take the opportunity to set the record straight. Our stance on challenging sexual abuse online remains the same. Last week in the other place, my noble Friend Lord Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, shared recent data from UK police forces that showed that 6,350 offences related to sexual communication with a child were recorded last year alone. Shockingly, 5,500 of those offences took place against primary school-age children. Those appalling statistics illustrate the urgent need for change. The Government are committed to taking action against the perpetrators and stamping out these horrific crimes. The information that social media companies currently give to UK law enforcement contributes to more than 800 arrests or voluntary attendances of suspected child sexual offenders on average every month. That results in an estimated 1,200 children being safeguarded from child sexual abuse.
There is no intention by the Government to weaken the encryption technology used by platforms. As a last resort, on a case-by-case basis, and only when stringent privacy safeguards have been met, Ofcom will have the power to direct companies to make best efforts to develop or source technology to identify and remove illegal child sexual abuse content. We know that this technology can be developed. Before it can be required by Ofcom, such technology must meet minimum standards of accuracy. If appropriate technology does not exist that meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a new solution.
It is a pleasure to speak during what I hope are the final stages of the Bill. Given that nearly all the Bills on which I have spoken up to now have been money Bills, this business of “coming back from the Lords” and scrutinising Lords amendments has not been part of my experience, so if I get anything wrong, I apologise.
Like other Members, I want to begin by thanking a number of people and organisations, including the Mental Health Foundation, Carnegie UK, the Internet Watch Foundation, the National Society for the Prevention of Cruelty to Children and two researchers for the SNP, Aaron Lucas and Josh Simmonds-Upton, for all their work, advice, knowledge and wisdom. I also join the hon. Members for Pontypridd (Alex Davies-Jones) and for Gosport (Dame Caroline Dinenage) in thanking the families involved for the huge amount of time and energy—and the huge amount of themselves—that they have had to pour into the process in order to secure these changes. This is the beginning of the culmination of all their hard work. It will make a difference today, and it will make a difference when the Bill is enacted. Members in all parts of the House will do what we can to continue to scrutinise its operation: to ensure that it works as intended, to ensure that children are kept as safe as possible online, and to ensure that Ofcom uses these powers to persuade platforms to provide, following the death of a child, the information about that child’s use of social media that they will be required to provide.
The Bill is about keeping people safe. It is a different Bill from the one that began its parliamentary journey, I think, more than two years ago. I have seen various Ministers leading from the Dispatch Box during that time, but the voices around the Chamber have been consistent, from the Conservative, Labour and SNP Benches. All the Members who have spoken have agreed that we want the internet to be a safer place. I am extremely glad that the Government have made so many concessions that the Opposition parties called for. I congratulate the hon. Member for Pontypridd on the inclusion of violence against women and girls in the Bill. She championed that in Committee, and I am glad that the Government have made the change.
Another change that the Government have made relates to small high-risk platforms. Back in May or June last year I tabled amendments 80, 81 and 82, which called for that categorisation to be changed so that it was not based just on the number of users. I think it was the hon. Member for Gosport who mentioned 4chan, and I have mentioned Kiwi Farms a number of times in the Chamber. Such organisations cannot be allowed to get away with horrific, vile content that encourages violence. They cannot be allowed a lower bar just because they have a smaller number of users.
The National Risk Register produced by the Cabinet Office—great bedtime reading which I thoroughly recommend—states that both the likelihood of harm and the number of people on whom it will have an impact should be taken into account before a decision is made. It is therefore entirely sensible for the Government to take into account both the number of users, when it is a significant number, and the extremely high risk of harm caused by some of these providers.
The hon. Lady is making an excellent speech, but it is critical to understand that this is not just about wickedness that would have taken place anyway but is now taking place on the internet; it is about the internet catalysing and exaggerating that wickedness, and spawning and encouraging all kinds of malevolence. We have a big responsibility in this place to regulate, control and indeed stop this, and the hon. Lady is right to emphasise that.
The right hon. Gentleman is entirely correct. Whether it involves a particularly right-wing cause or antisemitism—or, indeed, dieting content that drags people into something more radical in relation to eating disorders—the bubble mentality created by these algorithms massively increases the risk of radicalisation, and we therefore have an increased duty to protect people.
As I have said, I am pleased to see the positive changes that have been made as a result of Opposition pressure and the uncompromising efforts of those in the House of Lords, especially Baroness Kidron, who has been nothing short of tenacious. Throughout the time in which we have been discussing the Bill, I have spoken to Members of both Houses about it, and it has been very unusual to come across anyone who knows what they are talking about, and, in particular, has the incredible depth of knowledge, understanding and wisdom shown by Baroness Kidron. I was able to speak to her as someone who practically grew up on the internet—we had it at home when I was eight—but she knew far more about it than I did. I am extremely pleased that the Government have worked with her to improve the Bill, and have accepted that she has a huge breadth of knowledge. She managed to do what we did not quite manage to do in this House, although hopefully we laid the foundations.
I want to refer to a number of points that were mentioned by the Minister and are also mentioned in the letters that the Government provided relating to the Lords amendments. Algorithmic scrutiny is incredibly important, and I, along with other Members, have raised it a number of times—again, in connection with concern about radicalisation. Some organisations have been doing better things recently. For instance, someone who searches for something may begin to go down a rabbit hole. Some companies are now putting up a flag, for instance a video, suggesting that users are going down a dark hole and should look at something a bit lighter, and directing them away from the autoplaying of the more radical content. If all organisations, or at least a significant number—particularly those with high traffic—can be encouraged to take such action rather than allowing people to be driven to more extreme content, that will be a positive step.
I was pleased to hear about the upcoming researcher access report, and about the report on app stores. I asked a previous Minister about app stores a year or so ago, and the Minister said that they were not included, and that was the end of it. Given the risk that is posed by app stores, the fact that their content was not categorised as user-to-user content concerned me greatly. Someone who wants to put something on an Apple app store has to jump through Apple’s hoops, but the content is not owned by the app store, and the same applies to some of the material on the PlayStation store. It is owned by the person who created the content, and it is therefore user-to-user content. In some cases, it is created by one individual, and there is no ongoing review of it. Age rating is another issue: app stores apply whatever age rating they happen to decide on. Some of the dating apps, such as match.com, have been active in that regard, and have made it clear that their platforms are not for under-16s or under-18s, while the app store has rated the content as suitable for users younger than the platforms’ own minimum age. That is of concern, especially when the companies themselves are trying to improve age rating.
On the subject of age rating, I am pleased to see more in the Bill about age assurance and the frameworks. I am particularly pleased to see what is going to happen in relation to trying to stop children being able to access pornography. That is incredibly important but it had been missing from the Bill. I understand that Baroness Floella Benjamin has done a huge amount of work on pushing this forward and ensuring that parliamentarians are briefed on it, and I thank her for the work that she has done. Human trafficking has also been included. Again, that was something that we pushed for, and I am glad to see that it has been put on the face of the Bill.
I want to talk briefly about the review mechanisms, then I will go on to talk about end-to-end encryption. I am still concerned that the review mechanisms are not strong enough. We have pushed to have a parliamentary Committee convened, for example, to review this legislation. This is the fastest-moving area of life. Things are changing dramatically. How many people in here had even heard of ChatGPT a year and a half ago? How many people had used a virtual reality headset? How many people had accessed Rec Room or any of the other VR systems? I understand that the Government have genuinely tried their best to make the Bill as future-proof as possible, but we have no parliamentary scrutiny mechanisms written in. I am not trying to undermine the work of the Committee on this—I think it is incredibly important—but Select Committees are busy and they have no legislative power in this regard. If the Government had written in a review, that would have been incredibly helpful.
The hon. Lady is making a very good speech. When I first came to this House, which was rather a long time ago now, there was a Companies Act every year, because company law was changing at the time, as was the nature of post-war capitalism. It seems to me that there is a strong argument for an annual Act on the handling and management of the internet. What she is saying is exactly right, and that is probably where we will end up.
I completely support the right hon. Member’s point—I would love to see this happening on an annual basis. I am sure that the Ministers who have shepherded the Bill through would be terrified of that, and that the Government team sitting over there are probably quaking in their boots at the suggestion, but given how fast this moves, I think that this would be incredibly important.
The Government’s record on post-implementation reviews of legislation is pretty shoddy. If you ask Government Departments what percentage of legislation they have put through a post-implementation review within the required timeline, they will say that it is very small. Some Departments are a bit better than others, but given the number of reshuffles there have been, some do not even know which pieces of legislation they are supposed to be reviewing post-implementation. I am concerned that this legislation will get lost, and that there is no legislative back-up to any of the mechanisms for reviewing it. The Minister has said that it will be kept under review, but can we have some sort of governmental commitment that an actual review will take place, and that legislation will be forthcoming if necessary, to ensure that the implementation of this Bill is carried out as intended? We are not necessarily asking the Government to change it; we are just asking them to cover all the things that they intend it to cover.
On end-to-end encryption, on child sexual exploitation and abuse material, and on the last-resort powers over providers, I have been consistent with every Minister I have spoken to across the Dispatch Box and every time I have spoken to hon. Members about this: when there is any sharing of child sexual exploitation material or child sexual abuse material, we should be able to require the provider to find it. That absolutely trumps privacy. The largest increase in child sexual abuse material is in self-generated content. That is horrific, and we are seeing a massive increase in that number. We need providers to be able to search—using the hash numbers with which they can categorise images, or however they want to do it—for people who are sharing this material, in order to allow the authorities to arrest them and put them behind bars so that they cannot cause any more harm to children. That is more important than any privacy concerns. Although Ministers had not put it in the Bill until this point, they have, to their credit, been clear that protecting children trumps privacy concerns when it comes to abuse materials and exploitation. I am glad to see that that is now written into the Bill; it is important that it was not just stated at the Dispatch Box, even though it was mentioned by a number of Members.
It is very kind of you to call me to speak, Mr Deputy Speaker. I apologise to your good self, to the Minister and to the House for arriving rather tardily.
My daughter and her husband have been staying with me over the past few days. When I get up to make my wife and myself an early-morning cup of tea, I find my two grandchildren sitting in the kitchen with their iPads, which does not half bring home the dangers. I look at them and think, “Gosh, I hope there is security, because they are just little kids.” I worry about that kind of thing. As everyone has said, keeping children safe is ever more important.
The Bill’s progress shows some of the best aspects of this place and the other place working together to improve legislation. The shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), and the hon. Member for Aberdeen North (Kirsty Blackman) both mentioned that, and it has been encouraging to see how the Bill has come together. However, as others have said, it has taken a long time and there have been a lot of delays. Perhaps that was unavoidable, but it is regrettable. It has been difficult for the Government to get the Bill to where it is today, and the trouble is that the delays mean there will probably be more victims before the Bill is enacted. We see before us a much-changed Bill, and I thank the Lords for their 150 amendments. They have put in a lot of hard work, as others have said.
The Secretary of State’s powers worry my party and me, and I wonder whether the Bill still fails to tackle harmful activity effectively. Perhaps better things could be done, but we are where we are. I welcome the addition of new offences, such as encouraging self-harm and intimate image abuse. A future Bill might be needed to set out the thresholds for the prosecution of non-fatal self-harm. We may also need further work on the intent requirement for cyber-flashing, and on whether Ofcom can introduce such requirements. I am encouraged by what we have heard from the Minister.
We would also have liked to see more movement on risk assessment, as terms of service should be subject to a mandatory risk assessment. My party remains unconvinced that we have got to grips with the metaverse—this terrifying new thing that has come at us. I think there is work to be done on that, and we will see what happens in the future.
As others have said, education is crucial. I hope that my grandchildren, sitting there with their iPads, have been told as much as possible by their teachers, my daughter and my son-in-law about what to do and what not to do. That leads me on to the huge importance of the parent being able, where necessary, to intervene rapidly, because this has to be done damned quickly. If it looks like they are going down a black hole, we want to stop that right away. A kid could see something horrid that could damage them for life—it could be that bad.
Once a child sees something, they cannot unsee it. This is not just about parental controls; we hope that the requirement on the companies to do the risk assessments and on Ofcom to look at those will mean that those issues are stopped before they even get to the point of requiring parental controls. I hope that such an approach will make this safer by design when it begins to operate, rather than relying on having an active parent who is not working three jobs and therefore has time to moderate what their children are doing online.
The hon. Lady makes an excellent point. Let me just illustrate it by saying that each of us in our childhood, when we were little—when we were four, five or six—saw something that frightened us. Oddly enough, we never forget that throughout the rest of life, do we? That is what bad dreams are made of. We should remember that point, which is why those are wise words indeed.