Online Safety Act: Implementation

Wednesday 26th February 2025

Westminster Hall

Kirsty Blackman (Aberdeen North) (SNP)

I thank you for chairing this debate, Mr Stringer, and I congratulate the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) on bringing this debate to Westminster Hall. It is a subject we have talked about many times.

I want to make a number of points. The first is about safety by design. Page 1 of the Act states that the internet should be “safe by design”, yet everything that has happened since in the Act’s implementation, from the point of view of both Ofcom and the Government in respect of some of the secondary legislation, has not been about safety by design. It has been about regulating specific content, for example, and that is not where we should be. Much as I was happy that the Online Safety Act was passed, and I was worried about the perfect being the enemy of the good and all that, I am beginning to believe that the EU’s Digital Services Act will do a much better job of regulating, not least because the Government are failing to take enough action on this issue.

I am concerned that Ofcom, in collaboration with the Government, has managed to get us to a situation that makes nobody happy. It is not helpful for some of the tech companies. For example, category 1 is based solely on user numbers, which means that suicide forums, eating disorder platforms, doxing platforms and livestreaming platforms where self-generated child sexual abuse material is created are subject to exactly the same rules as a hill walking forum that gets three posts a week. In terms of proportionality, Ofcom is also failing the smallest platforms that are not risky, by requiring them to come to a three-day seminar on how to comply, when they might be run by a handful of volunteers spending a couple of hours a week looking after the forum and moderating every post. It will be very difficult for them to prove that children do not use their platforms, so there is no proportionality at either end of the spectrum.

In terms of where we are with the review, this is a very different Parliament from the one that began the conversations in the Joint Committee on the Draft Online Safety Bill. It felt like hardly anybody in these rooms knew anything about the online world or had any understanding of it. It is totally different now. There are so many MPs here who, for example, have an employment history of working hard to make improvements in this area. As the right hon. and learned Member said, we now have so much expertise in these rooms that we could act to ensure that the legislation worked properly. Rather than us constantly having to call these debates, the Government could rely on some of our expertise. They would not have to take on every one of a Joint Committee’s recommendations, for example, but they could rely on some of the expertise and the links that we have made over the years that we have been embedded in this area to help them make good decisions and ensure some level of safety by design.

Like so many Members in this place, I am concerned that the Act will not do what it is supposed to do. For me, the key thing was always keeping children safe online, whether that is about the commitments regularly given by the Government, which I wholeheartedly believe they wanted to fulfil, about hash matching to identify grooming behaviours, or about the doxing forums or suicide forums—those dark places of the internet—which will be subject to exactly the same rules as a hill walking forum. They are just going to fill in a risk assessment and say, “No children use our platform. There’s no risk on our platform, so it’s all good.” The Government had an opportunity to categorise them and they chose not to. I urge them to change their mind.

Draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025

Tuesday 4th February 2025

General Committees

Martin Wrigley (Newton Abbot) (LD)

It is a pleasure to serve under your chairship, Sir Christopher. I am disappointed in this statutory instrument. I recognise the Minister’s acknowledgment of the small sites, high-harm issue, but the issue is far more important and we are missing an opportunity here. Can the Minister set out why the regulations as drafted do not follow the will of Parliament, accepted by the previous Government and written into the Act, that thresholds for categorisation can be based on risk or size? That was a long-argued point that went through many iterations.

The then Minister accepted the amendment that was put forward and said:

“many in the House have steadfastly campaigned on the issue of small but risky platforms.”—[Official Report, 12 September 2023; Vol. 737, c. 806.]

He confirmed that the legislation would now give the Secretary of State the discretion to decide whether to set a threshold based on the number of users or the functionalities offered, or both factors, with the change ensuring that the framework was as flexible as possible in responding to the risk landscape. That has been thrown away in this new legislation. The Minister just said that we must do everything in our power, and yet the Government are throwing out a crucial change made to the Act to actually give them more power. They are getting rid of a power by changing this.

The amendment was to ensure that small sites dedicated to harm, such as sites providing information on suicide or self-harm or set up to target abuse and hatred at minority groups, as we saw in the riots in the summer, were subject to the fullest range of duties. When Ofcom published its advice, however, it disregarded this flexibility and advised that regulations should be laid bringing only the large platforms into category 1.

Kirsty Blackman (Aberdeen North) (SNP)

Is the hon. Member as concerned as I am that the Government seem to be ignoring the will of Parliament in their decision? Is he worried that young people particularly will suffer as a result?

Martin Wrigley

Absolutely—I am. The Secretary of State’s decision to proceed with this narrow interpretation of the Online Safety Act provisions, and the failure to use the power they have to reject Ofcom’s imperfect advice, will allow small, risky platforms to continue to operate without the most stringent regulatory restrictions available. That leaves significant numbers of vulnerable users—women and individuals from minority groups—at risk of serious harm from targeted activity on these platforms.

I will put a few more questions to the Minister. How do His Majesty’s Government intend to assess whether Ofcom’s regulatory approach to small but high-harm sites is proving effective, and have any details been provided on Ofcom’s schedule of research about such sites? What assessment have the Government made of the different harms occurring on small, high-harm platforms? Have they broken this down by type of harm, and will they make such information available? Have the Government received legal advice about the use of service disruption orders for small but high-harm sites? Do the Government expect Ofcom to take enforcement action against small but high-harm sites, and have they made an assessment of the likely timescales for enforcement action? Will the Government set out criteria against which they expect Ofcom to keep its approach to small but high-harm sites under continual review, as set out in their draft statement of strategic priorities for online safety?

Was the Minister aware of the previous Government’s commitment that Select Committees in both Houses would be given the opportunity to scrutinise draft Online Safety Act statutory instruments before they were laid? If she was, why did that not happen in this case? Will she put on record her assurances that Online Safety Act statutory instruments will in future be shared with the relevant Committees before they are laid?

For all those reasons, I will vote against the motion.

Kirsty Blackman

I appreciate the opportunity to speak in this Committee, Sir Christopher. Like at least one other Member in the room, I lived the Online Safety Bill for a significant number of months—in fact, it seemed to drag on for years.

As the Minister said, the Online Safety Act is long overdue. We have needed this legislation for 30 years, since I was a kid using the internet in the early ’90s. There has always been the risk of harm on online platforms, and there have always been places where people can be radicalised and can see misogynistic content or content that children should never be able to see. In this case, legislation has moved significantly slower than society—I completely agree with the Minister about that—but that is not a reason for accepting the statutory instrument or agreeing with the proposed threshold conditions.

On the threshold conditions, I am unclear as to why the Government have chosen 34 million and 7 million for the average monthly active users. Is it 34 million because Reddit happens to have 35 million average UK users—is that why they have taken that decision? I absolutely believe that Reddit should be in scope of category 1, and I am pretty sure that Reddit believes it should be in scope of category 1 and have those additional duties. Reddit is one of the places where the functionalities and content recommendation services mean that people, no matter what age they are, can see incredibly harmful content. They can also see content that can be incredibly funny—a number of brilliant places on Reddit allow people to look at pictures of cats, which is my favourite way to use the internet—but there are dark places in Reddit forums, where people can end up going down rabbit holes. I therefore agree that platforms such as Reddit should be in scope of category 1.

The Minister spoke about schedule 11 and the changes that were made during the passage of the Act. The Minister is absolutely right. Paragraph 1(5) of that schedule states:

“In making regulations under sub-paragraph (1), the Secretary of State must take into account the likely impact of the number of users of the user-to-user part of the service, and its functionalities, on how easily, quickly and widely regulated user-generated content is disseminated by means of the service.”

However, that does not undo the fact that we as legislators made a change to an earlier provision in that schedule. We fought for that incredibly hard and at every opportunity—in the Bill Committee, on the Floor of the House, in the recommitted Committee and in the House of Lords. At every stage, we voted for that change to be made, and significant numbers of outside organisations cared deeply about it. We wanted small high-risk platforms to be included. The provision that was added meant that the Secretary of State must make regulations relating to

“any other characteristics of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant.”

That was what the Government were willing to give us. It was not the original amendment that I moved in Bill Committee, which was specifically about small high-risk platforms, but it was enough to cover what we wanted.

What functionalities could and should be brought in scope? I believe that any service that allows users to livestream should be in the scope of category 1. We know that livestreaming is where the biggest increase in self-generated child sexual abuse material is. We know that livestreaming is incredibly dangerous, as people who are desperate to get access to child sexual abuse material can convince vulnerable young people and children to livestream. There is no delay in which that content can be looked at and checked before it is put up, yet the Government do not believe that every service that allows six-year-olds to livestream should be within the scope of category 1. The Government do not believe that those services should be subject to those additional safety duties, despite the fact that section 1 of the Online Safety Act 2023 says platforms should be “safe by design”. This is not creating platforms that are safe by design.

The regulations do nothing to stop young people streaming explicit videos to anyone, because they include only services with over 34 million users, or over 7 million when it comes to content recommendation. I agree that services of that scale are problematic, but there are other really problematic services, causing life-changing—or in some cases, life-ending—problems for children, young people and vulnerable adults, that will not be in the scope of category 1.

Generally, I am not a big fan of a lot of things that the UK Government have done; I have been on my feet, in the Chamber, arguing against a significant number of those things. This is one of the things that makes me most angry, because the Government, by putting forward this secondary legislation, are legislating in opposition to the will and intention of the Houses of Parliament. I know that we cannot bind a future Government or House, but this is not what was intended, agreed and voted on, nor what Royal Assent was given to; that was given on the basis of assurances from Government Ministers that they would look at those functionalities and small but high-risk platforms.

Given what Ofcom has put out in guidance and information on what it is doing about small but high-risk platforms, why are we not using everything that is available? Why are the Government not willing to use everything available to them to bring those very high-risk platforms into the scope of category 1?

The changes that category 1 services would be required to make include additional duties; for a start, they are under more scrutiny—which is to be expected—and they are put on a specific list of category 1 services which will be published. That list of category 1 services would include platforms such as 4chan, which some people may never have heard of. Responsible parents will see that list and say, “Hold on a second. Why is 4chan on there? I don’t want my children to be going on there. It is clearly not a ginormous platform, therefore it must be on there because it is a high-risk service.” Parents will look at that list and talk to their children about those platforms. The category 1 list alone, never mind the additional duties, would have a positive impact. Putting suicide forums on that list of category 1 services would have a positive impact on the behaviour of parents, children, and the teachers who teach those young people how to access the internet safely.

I guarantee that a significant number of teachers and people who are involved with young people have never heard of 4chan, but putting it on that list would give them an additional tool to enable them to approach young people and talk about the ways in which they use the internet.

Dr Danny Chambers (Winchester) (LD)

I thank the hon. Lady for speaking so passionately on this matter. As the Liberal Democrat mental health spokesperson, something that we are increasingly coming across is that it is not just adults asking children to livestream, but children, peer to peer, who do not realise that it is illegal. As the hon. Lady touched on, the mental health impact is huge but also lifelong. Someone can have a digital footprint that they can never get rid of, and children who are uninformed and uneducated about the impacts of their decisions could be affected decades into the future.

Kirsty Blackman

I completely agree. That is an additional reason why livestreaming is one of my biggest concerns. That functionality should have been included as a matter of course. Any of the organisations that deal with young people and the removal of child sexual abuse material online, such as the Internet Watch Foundation, will tell you that livestreaming is a huge concern. The hon. Member is 100% correct.

That is the way I talk to my children about online safety: once something is put online—once it is on the internet—it cannot ever be taken back. It is there forever, no matter what anyone does about it, and young people may not have the capacity to understand that. If systems were safe by design, young people simply would not have access to livestreaming at all; they would not have access to that functionality, so there would be that moment of thinking before they do something. They would not be able to do peer-to-peer livestreaming that can then be shared among the entire school and the entire world.

We know from research that a significant amount of child sexual abuse material is impossible to take down. Young people may put their own images online or somebody else may share them without their consent. Organisations such as the Internet Watch Foundation do everything they can to try to take down that content, but it is like playing whack-a-mole; it comes up and up and up. Once they have fallen into that trap, the content cannot be taken back. If we were being safe by design, we would ensure, as far as possible—as far as the Government could do, we could do or Ofcom could do—that no young person would be able to access that functionality. As I said, it should have been included.

I appreciate what the Government said about content recommendation and the algorithms that are used to ensure that people stay on platforms for a significant length of time. I do not know how many Members have spent much time on TikTok, but people can start watching videos of cats and still be there an hour and a half later. The algorithms are there to try to keep us on the platform. They are there because, actually, the platforms make money from our seeing the advertisements. They want us to see exciting content. Part of the issue with the content recommendation referenced in the conditions is that platforms are serving more and more exciting and extreme content to try to keep us there for longer, so we end up with people being radicalised on these platforms—possibly not intentionally by the platforms, but because their algorithm serves more and more extreme content.

I agree that that content should have the lower threshold in terms of the number of users. I am not sure about the threshold numbers themselves, but I think the Government have that differentiation correct, particularly on the addictive nature of algorithmic content. However, they are failing on incredibly high-risk content. The additional duties for category 1 services involve a number of different things: illegal content risk assessments, duties relating to terms of service, children’s risk assessments, adult empowerment duties and record-keeping duties. As I said, the fact that those category 1-ranked platforms will be on a list is powerful in itself, but adding those additional duties is really important.

Let us say that somebody is undertaking a risky business—piercing, for example. Even though not many people get piercings in the grand scheme of things, the Government require piercing organisations to jump through additional hoops because they are involved in dangerous things that carry a risk of infection and other associated risks. They are required to meet hygiene regulations, register with environmental health and have checks of their records to ensure that they know who is being provided with piercings, because it is a risky thing. The Government are putting additional duties on them because they recognise that piercing is risky and potentially harmful.

However, the Government are choosing not to put additional duties on incredibly high-risk platforms. They are choosing not to do that. They have been given the right to do that. Parliament has made its will very clear: “We want the Government to take action over those small high-risk platforms.” I do not care how many hoops 4chan has to jump through. Give it as many hoops as possible; it is an incredibly harmful site, and there are many others out there—hon. Members mentioned suicide forums, for example. Make them jump through every single hoop. If we cannot ban them outright—which would be my preferred option—make them keep records, make them have adult-empowerment duties, and put them on a list of organisations that we, the Government or Ofcom reckon are harmful.

If, due to the failures of this Act, a young person commits suicide on a platform that is not categorised properly, there is then a reduction in the protections, and in the information that the platform has to provide to the family about the deceased child, because it is not categorised as category 1 or 2B. We could end up in a situation where a young person dies as a result of being radicalised on a forum—because the Government decided it should not be in scope—but that platform does not even have to provide the deceased child’s family with access to that online usage. That is shocking, right? If the Government are not willing to take the proper action required, at least bring these platforms into the scope of the actions and requirements related to deceased children.

I appreciate that I have taken a significant length of time—although not nearly as long as the Online Safety Act has taken to pass, I hasten to say—but I am absolutely serious about the fact that I am really, really angry about this. This is endangering children. This is endangering young people. This is turning the Online Safety Act back into what some people suggested it should be at the beginning, an anti-Facebook and anti-Twitter Act, or a regulation of Facebook and Twitter—or X—Act, rather than something that genuinely creates what it says in section 1 of the Act: an online world that is “safe by design”.

This is not creating an online world that is safe by design; this is opening young people and vulnerable adults up to far more risks than it should. The Government are wilfully making this choice, and we are giving them the opportunity to undo this and to choose to make the right decision—the decision that Parliament has asked them to make—to include functionalities such as livestreaming, and to include those high-risk platforms that we know radicalise people and put them at a higher risk of death.

--- Later in debate ---
Feryal Clark

I thank all Members for their very powerful contributions to the debate. This instrument will bring us one step closer to a safer online world for our citizens. It is clearer than ever that it is desperately needed: transparency, accountability and user empowerment matter now more than ever.

The Opposition spokesperson, the hon. Member for Huntingdon, asked whether we agree on the need for companies not to wait for the duties in the Act to be implemented, but to ensure that safety is baked in from the start. I absolutely agree, and he will be aware that the Secretary of State has made that point on many occasions. He also raised the issue of proportionality. I confirm that many of the duties on categorised services are subject to the principle of proportionality, which requires Ofcom to consider measures that are technically feasible for providers of a certain size or capacity, and in some cases duties are based on the assessment of risk of harm presented by the service.

For example, in determining what is proportionate for the user empowerment duties on content for category 1 services, the findings of the most recent user empowerment assessments are relevant. They include the incidence of relevant content on the service in addition to the size and capacity of the provider. Where a code of practice is relevant to a duty, Ofcom must have regard to the principles on proportionality, and what is proportionate for one kind of service might not be for another.

The hon. Member for Huntingdon is absolutely right that the pornography review has been completed. The Government are considering it at the moment and will publish it in due course.

In response to the hon. Members for Newton Abbot and for Aberdeen North (Kirsty Blackman) and to the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), when the Online Safety Act was introduced, category 1 thresholds were due to be assessed based on the level of risk and harm for adults—as the Members read out very clearly. That was removed during the passage of the Bill by the previous Government.

As things stand, although Baroness Morgan’s successful amendment made it possible for threshold conditions to be based solely on functionalities, it did not change the basis of Ofcom’s research, which for category 1 is easy, quick and wide dissemination of content. The Secretary of State had to consider that. I will repeat that for all Members to hear again: the Secretary of State has to act within the powers given to him in schedule 11 when setting out the threshold and conditions. The powers do not allow for thresholds to be determined by another body, as per the amendment.

Although the hon. Member for Aberdeen North very powerfully read out the Act, the Act very clearly does not do what she is asking it to do. We absolutely agree that small but risky sites need to be covered, but as it stands, the Secretary of State does not have the powers to include them.

Kirsty Blackman

Will the Minister give way?

Feryal Clark

Sorry, I have lots of points to cover. If I have not covered the hon. Member’s concerns in my response, she is more than welcome to intervene later.

These small but risky services are of significant concern to the Government, and they will still have to protect against illegal content and, where relevant, content that is harmful to children. Ofcom also has a dedicated taskforce to go after them. I hope that answers the hon. Member’s question.

The hon. Member for Newton Abbot also raised the review of Ofcom’s approach. The regulator has already trialled an approach of targeting small but risky services through its regulation of video-sharing platforms. Indeed, a number of those services improved their policies and content moderation in response. All the adult platforms under the VSP regime, large and small, have implemented age verification through this route to ensure that under-18s cannot access pornography on their services. In instances where services fail to make necessary changes, they will face formal enforcement action from Ofcom. Ofcom has a proven track record and the Government have every faith in its ability to take action against non-compliant services.

The hon. Member also raised issues around how Ofcom will enforce action against small but risky services. Ofcom will have robust enforcement powers available to use against companies that fail to fulfil their duties and it will be able to issue enforcement decisions. Action can include fines of up to £18 million or 10% of qualifying worldwide revenue in the relevant year, whichever is higher, and Ofcom can direct companies to take specific steps to comply with its regulation.

--- Later in debate ---
Feryal Clark

I am going to make some progress. On livestreaming, Ofcom considered that functionality, but concluded that the key functionalities that spread content easily, quickly and widely are content recommender systems and forwarding or resharing user-generated content.

Services accessed by children must still be safe by design, regardless of whether they are categorised. Small but risky services will also still be required to comply with illegal content duties. The hon. Member for Aberdeen North should be well aware of that as she raised concerns on that issue.

On child safety, there were questions about how the online safety regime protects children from harmful content. The Act requires all services in scope to proactively remove and prevent users from being exposed to priority illegal content, such as illegal suicide content and child sexual exploitation and abuse material. That is already within the remit.

In addition, companies whose services are likely to be accessed by children will need to take steps to protect children from harmful content and behaviour on their services, including content that is legal but none the less presents a risk of harm to children. The Act designates content that promotes suicide or self-harm as in the category of primary priority content that is harmful to children. Parents and children will also be able to report pro-suicide or pro-self-harm content to the platform, and the reporting mechanism will need to be easy to navigate for child users. On 8 May, Ofcom published its draft children’s safety codes of practice, in which it proposed measures that companies should employ to protect children from suicide and self-harm content, as well as other content.

Finally, on why category 1 is not based on risk, such as the risk of hate speech, when the Act was introduced, category 1 thresholds were due to be assessed on the level of risk of harm to adults from priority content disseminated by means of that service. As I said earlier, that was removed during the Act’s passage by the then Government and replaced with consideration of the likely functionalities and how easily, quickly and widely user-generated content is disseminated, which is a significant change. Although the Government understand that that approach has its critics, who argue that the risk of harm is the most significant factor, that is the position under the Act.

Kirsty Blackman

The Minister is making the case that the Secretary of State’s hands are tied by the Act—that it requires stuff in relation to the number of users. Can she tell us in which part of the Act it says that, because it does not say that? If she can tell us where it is in the Act, I am quite willing to sit down and shut up about this point, but it is not in the Act.

Feryal Clark

The legislation allows the Secretary of State to deviate from Ofcom’s advice and to publish a statement explaining why. However, the core consideration for category 1 under schedule 11 is—I repeat for the third time—how easily, quickly and widely regulated user-generated content is disseminated by means of a service. As a result, for category 1, Ofcom concluded that the content is disseminated with increased breadth as the number of users increases.

The decision to proceed with the threshold combination recommended by Ofcom, rather than discounting user-number thresholds, reflects that any threshold condition created by the Government should consider the factors as set out in the Act, including easy, quick and wide dissemination for category 1, and the evidence base. That is what the Act says. As a result, the Government decided to not proceed with an approach that deviated from Ofcom’s recommendation, particularly considering the risk of unintended consequences.

I am more than happy to write to the hon. Member for Aberdeen North with the full details. I understand that she feels very passionately about this point, but the Act is the Act. Although I am grateful for her contribution, I have to follow what the Act says, based on the legal advice that I get.

--- Later in debate ---
Kirsty Blackman

Thank you, Sir Christopher—I appreciate that prod. I did look at Standing Orders this morning, but could not find that bit, so that is incredibly helpful.

On what the Minister said about schedule 11 and the notes that she has been passed from her team on that point, I appreciate her commitment to share the Government’s legal advice. That will be incredibly helpful; it would have been helpful to have it in advance of this Committee.

In schedule 11, it says:

“In making regulations under sub-paragraph (1), the Secretary of State must take into account the likely impact of the number of users of the user-to-user part of the service, and its functionalities, on how easily, quickly and widely regulated user-generated content is disseminated by means of the service.”

Perhaps I cannot read English, or perhaps the Minister, her legal advisers and the team at DSIT read it in a different way from me, but the Secretary of State having to take something into account and the Secretary of State being bound by something are two different things—they are not the same. It does not say that the Secretary of State must regulate only on the specific number of users.

In fact, schedule 11 says earlier that the Secretary of State

“must make regulations specifying conditions…for the user-to-user part of regulated user-to-user services relating to each of the following”,

which are the

“number of users…functionalities of that part of the service, and…any other characteristics of that part of the service or factors”.

The Secretary of State must therefore make regulations in relation to any other characteristics of that part of the service or factors

“relating to that part of the service that the Secretary of State considers relevant.”

He must do that, but he must only take into account the number of users. The Government, however, have decided that what they must take into account is much more important than what they must do. They have decided that despite Parliament being pretty clear in the language it has used.

I am not terribly happy with the Online Safety Act. It is a lot better than the situation we have currently, but it is far from perfect. As the Minister said, I argued in favour of keeping the stuff about legal but harmful content for adults. I argued against the then Government’s position on that, but the Act is the Act that we have.

The Minister’s point does not make sense. The Secretary of State has to take into account the number of users and how quickly things are disseminated, but he must make regulations about functionalities or factors that he considers relevant. Therefore, it seems that he does not consider suicide forums and livestreaming to be relevant; if he did, he would surely be bound by the “must” and would have to make regulations about them. It is frustrating that the Act does not do what it is supposed to do and does not protect young people from livestreaming. The Minister said that it protects people from seeing that illegal content, but it does not prevent them from creating it.

The Government could make regulations so that every platform that has a livestreaming functionality, or even every platform that has child users on it—there is a lot in the Act about the proportion of children who use a service—is automatically included in category 1 because they consider them to be high risk.

Sir Jeremy Wright

It would not be right for either of us to ask the Minister to disclose legal advice—that clearly would not be appropriate—but I am grateful for the Minister’s offer to share a slightly more expansive description of why the Government have come to the conclusion that they have.

On the hon. Lady’s point about what the Act actually says, we have both quoted paragraph 1(5) of schedule 11, which deals with whether the language that has found its way into the ministerial statement is the be-all and end-all of the Minister’s conclusions. We both think it is not. If it is the case, as I think the Minister is arguing, that the ability to disseminate “easily, quickly and widely” is essentially a synonym for the scale of the service and the number of its users, what does the hon. Lady think of the amendment that Baroness Morgan made in the other place to paragraph 1(4), which says that when the regulations we are considering specify

“the way or ways in which the relevant conditions are met”,

for category 1 threshold conditions

“at least one specified condition about number of users or functionality must be met”?

The crucial word that was added is “or”. If the number of users were required to establish what the hon. Lady has described, the word “or” would be inappropriate.

Kirsty Blackman

I absolutely agree, and that is a helpful clarification.

If the Government have decided that it is too difficult to regulate high-risk platforms as category 1, and that they do not matter enough because they do not have enough of an impact, they should stand up and tell us that. Rather than saying that their hands have been tied by the Act—they manifestly have not—they need to take ownership of their actions. If they have decided that such platforms are not important enough or that they cannot be bothered having a fight with Ofcom about that, they should be honest and say, “This is the position we have decided to take.” Instead, they are standing up and saying, “Our hands have been tied,” but that is just not correct: their hands have not been tied by the Act.

I appreciate that the Minister will get in touch with me about the legal advice, but it will be too late. This statutory instrument will have been through the process by that time, and people will have been put at risk as a result of the Government’s failure. They have the power to take action in relation to functionalities and factors, and in relation to suicide forums, livestreaming and the creation of child sexual abuse material, and they are choosing not to.

If the Government have decided that it is too difficult to do that, that those platforms are not risky enough and that not enough people are being harmed by them, they need to hold their hands up and say, “We’ve decided that this is the position we are going to take.” They must not hide behind the legislation, which does not say what they are telling us it says. They should just be honest about the fact that they have decided that they cannot be bothered to take action. They cannot be bothered to have a fight with Ofcom because it is not important enough. Hiding behind the legislation is incredibly cowardly—it does not say that.

AstraZeneca

Monday 3rd February 2025

Commons Chamber

Chris Bryant

Not only have we set aside £520 million precisely to be able to invest in the life sciences industry with an innovation fund, but we are very keen to work with specific businesses to understand how they can make more secure, long-term investment. The single most important thing for most people making an investment in the UK is whether they believe there is political, fiscal and financial stability in the UK. That is what we are absolutely determined to deliver. My hon. Friend makes a very good point about those who are immunosuppressed for all sorts of different reasons, whether because of their medication or a condition. I will take that point back to the Department.

Kirsty Blackman (Aberdeen North) (SNP)

The Chancellor said that economic growth is the most important thing and this was an opportunity to get some of that economic growth. This was an opportunity to get something over the line and the UK Government failed to deliver it. How can the House and the public trust anything the UK Government say? How can they say that this is the founding mission if they then fail to deliver for a region that could really do with that economic growth?

Chris Bryant

The thing is that spending taxpayers’ money has to be proven to be good value for money. That is why, whenever we are making an investment such as this, we have to make sure it delivers more return on investment than £1 for £1. When AstraZeneca made the decision to cut the R&D part of its budget from £150 million to £90 million, it made sense for the UK Government to look again at the amount of money we could legitimately put in on behalf of the taxpayer. If the hon. Lady had been in my place, I think she would have made exactly the same decision.

Online Safety: Children and Young People

Tuesday 26th November 2024

Westminster Hall

Lola McEvoy

I thank my hon. Friend for raising the really important—indeed, deeply concerning—issue of the rise of anti-women hate, with the perpetrators marketing themselves as successful men.

What we are seeing is that boys look at such videos and do not agree with everything that is said, but little nuggets make sense to them. For me, it is about the relentless bombardment: if someone sees one video like that, they might think, “Oh right,” and not look at it properly, but they are relentlessly targeted by the same messaging over and over again.

That is true not just for misogynistic hate speech, but for body image material. Girls and boys are seeing unrealistic expectations of body image, which are often completely fake and contain fake messaging, but which make them reflect on their own bodies in a negative way, when they may not have had those thoughts before.

I want to drive home that being 14 years old is tough. I am really old now compared with being 14, but I can truly say to anybody who is aged 14 watching this: “It gets better!” It is hard to be a 14-year-old: they are exploring their body and exploring new challenges. Their hormones are going wild and their peers are going through exactly the same thing. It is tough, and school is tough. It is natural for children and young people to question their identity, their role in the world, their sexuality, or whatever it is they might be exploring—that is normal—but I am concerned that that bombardment of unhealthy, unregulated and toxic messaging at a crucial time, when teenagers’ brains are developing, is frankly leading to a crisis.

I return to an earlier point about whether the parts of apps or platforms that children are using are actually safe for them to use. There are different parts of apps that we all use—we may not all be tech-savvy, but we do use them—but when we drill into them and take a minute to ask, “Is this safe for children?”, the answer for me is, “No.”

There are features such as the live location functionality, which comes up a lot on apps, such as when someone is using a maps app and it asks for their live location so they can see how to get from A to B. That is totally fine, but there are certain social media apps that children use that have their live location on permanently. They can toggle it to turn it off, but when I asked children in Darlington why they did not turn it off, they said there is a peer pressure to keep it on—it is seen as really uncool to turn it off. It is also about being able to see whether someone has read a message or not.

I then said to those children, “Okay, but those apps are safe because you only accept people you know,” and they said, “Oh no, I’ve got thousands and thousands of people on that app, and it takes me ages to remove each person, because I can’t remember if I know them, so I don’t do it.” They just leave their location on for thousands of people, many of whom may have abandoned their accounts, and they do not even know if those accounts are active any more. The point is that we would not allow our children to go into a space where their location was shown to lots of strangers all the time. The children I spoke to also said that the live location feature on some of these apps is leading to in-person bullying and attacks. That is absolutely horrifying.

Kirsty Blackman (Aberdeen North) (SNP)

On that point, is the hon. Member aware that if someone toggles their location off on Snapchat, for example, it constantly—in fact, every time the app is opened—says, “You’re on ghost mode. Do you want to turn your location back on?” So every single time someone opens the app, it tries to convince them to turn their location back on.

Lola McEvoy

I thank the hon. Member for raising that issue, because there are lots of different nudge notifications. We can understand why, because it is an unregulated space and the app is trying to get as much data as possible—if we are not paying for the service, we are the service. We all know that as adults, but the young people and children who we are talking about today do not know that their data is what makes them attractive to that app.

--- Later in debate ---
Kirsty Blackman (Aberdeen North) (SNP)

I could talk for hours on this subject, Mr Dowd, but do not worry, I will not. There are a number of things that I would like to say. Not many Members present sat through the majority of the Online Safety Bill Committee as it went through Parliament, but I was in every one of those meetings, listening to various views and debating online safety.

I will touch on one issue that the hon. Member for Darlington (Lola McEvoy) raised in her excellent and important speech. I agree with almost everything she said. Not many people in Parliament have her level of passion or knowledge about the subject, so I appreciate her bringing forward the debate.

On the issue of features, I totally agree with the hon. Member and I moved an amendment to that effect during the Bill’s progress. There should be restrictions on the features that children should be able to access. She was talking about safety by design, so that children do not have to see content that they cannot unsee, do not have to experience the issues that they cannot un-experience, cannot be contacted by external people who they do not know, and cannot livestream. We have seen an increase in the amount of self-generated child sexual abuse material and livestreaming is a massive proportion of that.

Yesterday, a local organisation in Aberdeen called CyberSafe Scotland launched a report on its work in 10 of our primary schools with 1,300 children aged between 10 and 12—primary school children, not secondary school children. Some 300 of those children wrote what is called a “name it”, where they named a problem that they had seen online. Last night, we were able to read some of the issues that they had raised. Pervasive misogyny is everywhere online, and it is normalised. It is not just in some of the videos that they see and it is not just about the Andrew Tates of this world—it is absolutely everywhere. A couple of years ago there was a trend in online videos of young men asking girls to behave like slaves, and that was all over the place.

Children are seeing a different online world from the one that we experience because they have different algorithms and have different things pushed at them. They are playing Roblox and Fortnite, but most of us are not playing those games. I am still concerned that the Online Safety Act does not adequately cover all of the online gaming world, which is where children are spending a significant proportion of their time online.

A huge amount more needs to be done to ensure that children are safe online. There is not enough in place about reviewing the online safety legislation, which Members on both sides of the House pushed for to ensure that the legislation is kept as up to date as possible. The online world changes very rapidly: the scams that were happening nine months ago are totally different from those happening today. I am still concerned that the Act focuses too much on the regulation of Facebook, for example, rather than the regulation of the online world that our children actually experience. CyberSafe Scotland intentionally centred the views and rights of young people in its work, which meant that the programmes that it delivered in schools were much more appropriate and children were much better able to listen and react to them.

The last thing that I will mention is Girlguiding and its girls’ attitude survey. It is published on an annual basis and shows a huge increase in the number of girls who feel unsafe. That is because of the online world they are experiencing. We have a huge amount of responsibility here, and I appreciate the hon. Member for Darlington bringing the debate forward today.

Peter Dowd (in the Chair)

I will keep this to an informal four-minute limit. Regrettably, if Members speak beyond that, I will have to introduce a formal figure.

AI Seoul Summit

Thursday 23rd May 2024

Commons Chamber

Saqib Bhatti

I completely agree with my right hon. Friend. We recognise the risks and opportunities that AI presents. That is why we have tried to balance safety and innovation. I refer him to the Online Safety Act 2023, which is a technology-agnostic piece of legislation; AI is covered across the range of areas where the Act addresses illegal harms, so to speak. He is right to say that this is about helping humanity to move forward. It is absolutely right that we should be conscious of the risks, but I am also keen to support our start-ups, our innovative companies and our exciting tech economy to do what they do best and move society forward. That is why we have taken this pro-safety, pro-innovation approach; I repeat that safety in this field is an enabler of growth.

Kirsty Blackman (Aberdeen North) (SNP)

I would like to thank Sir Roger Gale, who has just left the Chair. He has been excellent in the Chair and I have very much enjoyed his company as well as his chairing.

I thank the Government for advance sight of the statement. My constituents and people across these islands are concerned about the increasing use of AI, not least because of the lack of regulation in place around it. I have specific questions in relation to the declarations and what is potentially coming down the line with regulation.

Who will own the data that is gathered? Who has responsibility for ensuring its safety? What is the Minister doing to ensure that regard is given to copyright and that intellectual property is protected for those people who have spent their time, energy and massive talents in creating information, research and artwork? What are the impacts of the use of AI on climate change? It has been made clear that using this technology affects the climate because of the vast amounts of electricity that it consumes. Are the Government considering that?

Will the Minister ensure that in any regulations that come forward there is a specific mention of AI harms for women and girls, particularly when it comes to deepfakes, and that they and other groups protected by the Equality Act 2010 are explicitly mentioned in any regulations or laws that come forward around AI? Lastly, we waited 30 years for an Online Safety Act. It took a very long time for us to get to the point of having regulation for online safety. Can the Minister make a commitment today that we will not have to wait so long for regulations, rather than declarations, in relation to AI?

Saqib Bhatti

The hon. Lady makes some interesting points. The thing about AI is not just the large language models, but the speed and power of the computer systems and the processing power behind them. She talks about climate change and other significant issues we face as humanity; that power to compute will be hugely important in predicting how climate change evolves and weather systems change. I am confident that AI will play a huge part in that.

AI does not recognise borders. That is why the international collaboration and these summits are so important. In Bletchley we had 28 countries, plus the European Union, sign the declaration. We had really good attendance at the Seoul summit as well, with some really world-leading declarations that will absolutely be important.

I refer the hon. Lady to my earlier comments around copyright. I recognise the issue is important because it is core to building trust in AI, and we will look at that. She will understand that I will not be making a commitment at the Dispatch Box today, for a number of reasons, but I am confident that we will get there. That is why our approach in the White Paper response has been well received by the tech industry and the AI sector.

The hon. Lady started with a point about how constituents across the United Kingdom are worried about AI. That is why we all have to show leadership and reassure people that we are making advances on AI and doing it safely. That is why our AI Safety Institute was so important, and why the network of AI safety institutes that we have helped to advise on and worked with other countries on will be so important. In different countries there will be nuances regarding large language models and different things that they will be approaching—and sheer capability will be a huge factor.

Smartphones and Social Media: Children

Tuesday 14th May 2024

Westminster Hall

Kirsty Blackman (Aberdeen North) (SNP)

Thank you for chairing this debate, Sir George. I congratulate the hon. Member for Penistone and Stocksbridge (Miriam Cates) on securing it. I want to talk about a number of things: safety online by design, the safety of devices by design, and parental and child education. Just to confuse everyone, I will do that in reverse order, starting off with parental and child education.

Ofcom has a media literacy strategy consultation on the go just now, as well as the consultation on the strategy around protecting children. Both are incredibly important. We have a massive parental knowledge gap. In about 15 or 20 years, this will not be nearly so much of a problem, because parents then will have grown up online. I am from the first generation of parents who grew up online. My kids are 10 and 13. I got the internet at home when I was six or seven, although not in the way that my kids did. Hardly anybody in this House grew up on the internet, and hardly any of the parents of my children’s peers grew up online.

I know very well the dangers there are online, and I am able to talk to my children about them. I have the privilege, the ability and the time to ensure that I know everything about everything they are doing online—whether that means knowing the ins and outs of how Fortnite works, or how any of the games they are playing online work, I am lucky enough to be able to do that. Some parents have neither the time nor the energy nor the capacity to do that.

Miriam Cates

I commend the hon. Lady for her knowledge and dedication, but is it not the case that even parents as diligent as her find that teenagers can bypass these controls? Even if our children do not have access to a device, they can easily be shown the most harmful of material on the school bus. Is this not actually about child development, and whether a child has the brain development to be able to use these devices safely, rather than just about education?

Kirsty Blackman

I wanted to talk about education among a number of other things. Children can absolutely be shown things on the bus, and stuff like that; children and young people will do what they can to subvert their parents’ authority in all sorts of ways, not just when it comes to mobile phones. Part of the point I was making is that I have the privilege of being able to take those actions, while parents who are working two jobs and are really busy dealing with many other children, for example, may not have the time to do so. We cannot put it all on to parental education, but we cannot put it all on to the education of the children, either. We know that however much information we give children or young people—particularly teenagers—they are still going to make really stupid decisions a lot of the time. I know I made plenty of stupid decisions as a teenager, and I am fairly sure that my children will do exactly the same.

I grew up using message boards, which have now been superseded by Reddit, and MSN Messenger, while kids now use Facebook Messenger or WhatsApp. I grew up using internet relay chat—IRC—and Yahoo! Chat, which have been taken over by Discord, and playing Counter-Strike, which has now been subsumed by Fortnite. I used Myspace and Bebo, while kids now use things like Instagram. These things have been around for a very long time. We have needed an online safety Act for more than 20 years. When I was using these things in the ’90s, I was subject to the same issues that my children and other children see today. Just because it was not so widespread does not mean it was not happening, because it absolutely was.

The issue with the Online Safety Act is that it came far too late—I am glad that we have it, but it should have been here 20 years ago. It also does not go far enough: it does not cover all the things that we need it to cover. During the passage of the Act, we talked at length about livestreaming and how children should not be allowed to livestream under any circumstances. Given the massive increase in the amount of self-generated child sexual abuse images, we could simply have required all platforms to stop children livestreaming, but the Government chose not to do that.

We have to have safety by design in these apps. We have to ensure that Ofcom is given the powers—which, even with the Online Safety Act, it does not have—to stop platforms allowing these things to happen and effectively to ban children from accessing them. Effective age assurance would address some of the problems that the hon. Member for Penistone and Stocksbridge raises. Of course, children will still try to get around these things, but having age assurance and age gating, as far as we possibly can—for example, the work that Ofcom is doing around pornographic content—will mean that children are not able to access that content. I do not see that there should be any way for a child to access pornographic content once the Online Safety Act fully comes into force and Ofcom has the full powers and ability to enforce it.

The other issue with the Online Safety Act is that it is too slow. There are a lot of consultation procedures and lead-in times. It should have come in far quicker, and then we would have had this protection earlier for our children and young people.

We need to have the safety of devices by design. I am slightly concerned about the number of children who are not lucky enough to get a brand-new phone; the right hon. Member for Chelmsford (Vicky Ford) talked about passing on a phone to a child. Covering hand-me-down devices is essential if we are to have safety of devices by design. Online app stores are not covered as effectively as they should be, particularly when it comes to age ratings. I spoke to representatives of an online dating app, who said that they want their app to be 18-plus, but one of the stores has rated it as 16-plus; they keep asking the store to change it, and the store keeps refusing. It is totally ridiculous that we are in that situation. The regulation of app stores is really important, especially when parents rely on the app store's age rating; they will assume that the rating the store puts forward is roughly correct. We need to make changes in that respect, and we need to come down on the app stores, because they are so incredibly powerful. The app store is a real moment at which parents, if they have parental controls, have the ability to make changes.

In relation to safety online by design, I have already spoken about livestreaming. When it comes to gaming, it is entirely possible for children to play online games without using chat functions; lots of online games do not have any chat function at all. Children can play Minecraft without any chat; they cannot play Roblox without effective access to chat. Parents need to understand the difference between Minecraft and Roblox—and not allow anyone to play Roblox, because it is awful.

There are decisions that need to be taken in relation to safety online by design. If platforms have effective age verification and an effective understanding of the audience for each of these apps and online settings, they can ensure that the rules are in place. I am not convinced yet that Ofcom has enough powers to say what is and is not safe for children online. I am not convinced that, even with the Online Safety Act, it has the flexibility to say, "Right—if you have done your child access assessment and you think that your app is likely to be used by children, you cannot have livestreaming on the app." I am not convinced that it has enough teeth to take that action. It does when it comes to illegal content, but when it comes to things that are harmful to children but legal for adults, there is not quite enough strength for the regulator.

I will keep doing what I have been doing in this House, which is saying that the online world can be brilliant—it can be great. Kids can have a brilliant time playing online. They can speak to their friends; particularly if children are isolated or lonely, there are places where they can find fellowship and other people who are feeling the same way. That can be a positive thing. The hon. Member for Penistone and Stocksbridge has laid out where often the online world is negative, but it can be positive too. There are so many benefits in terms of schoolwork, clubs, accessing friends, and calendars. Cameras are great, too. My children sometimes use the bird app on their phones to work out which birds are singing. It is brilliant that they can do things like that online.

There are so many benefits, but we have a responsibility, just as we do when our children are playing in the park, to ensure that they are safe. We have a responsibility as legislators to ensure that the regulators have enough teeth to make sure that the online world is safe, so that children can get the benefits of the online world and of using smartphones but are not subject to the extremely negative outcomes. My hon. Friend the Member for Stirling (Alyn Smith) mentioned his constituent and the awful loss experienced by their family. Children should never, ever have to face that situation online, and we have a responsibility to regulate to ensure that they never have to.

Digital Exclusion

Kirsty Blackman Excerpts
Wednesday 28th February 2024

Westminster Hall

Kirsty Blackman (Aberdeen North) (SNP)

Thank you for your work chairing this debate, Mrs Harris. I congratulate the hon. Member for Ellesmere Port and Neston (Justin Madders) on bringing forward such a popular and important debate.

I will focus my comments on the skills required to access the digital world. The access issues have been raised and are incredibly important—I do not want to take away from that. On skills, however, by 2030 some 5 million workers will be acutely under-skilled in basic digital skills. That is a significant number, and it must be a massive concern for the Government.

The skills that people require to get online must be considered. There is a generational issue: younger people are generally better at accessing these things, although that is not true across the board. There is an intersectionality of issues: people are less likely to have digital skills if they are more vulnerable, older or in poverty, or if they do not have the capacity or time to acquire them. Given the cost of living crisis, I am increasingly seeing constituents working multiple jobs who just do not have the time to work on their digital skills, because they are too busy trying to make ends meet. That is a really big concern for me.

Covid and the roll-out of online services were mentioned. During covid, the Scottish Government provided 72,000 devices and 14,000 internet connections to individuals, children and families at risk of being digitally excluded. That provision has massively increased—the number of devices was up to 280,000 in 2022—and we are increasing it further, to ensure that young people are not digitally excluded and can spend time at home typing up documents in Microsoft Word, Google Sheets or whatever the school prefers them to use. It is so important that digital skills are available to people and that the workforce of the future has them.

Mr Robin Walker

I recognise the good work that the Scottish Government, and indeed the English Government—the UK Government—did on getting devices out to people. However, UNESCO highlighted to us, among other things, the cost of devices: having gone out to people, they need to be maintained and their security needs to be upgraded. One of the things we need to think about very carefully in all our Government budgets going forward is how to ensure ongoing investment in the digital technologies that are needed, both for the people receiving them and for those distributing them.

Kirsty Blackman

I agree. On continuing access to the internet, a social tariff is available for people on universal credit. Every time I meet my local jobcentre, I make clear how important it is to stress that the social tariff is available, so that people can take up that reduced-cost internet access. It is important that the tariff exists and that people know it exists, so that they can take it up.

Within my constituency, I have spoken to Virgin Money, which provides access to internet services. There is also an organisation called Silver Surfers, which gives older people the advice and support they need to get online. We have heard about some of the negatives of the internet and some of the positives of online life. Being able to access services online is important, particularly for people in rural communities who are a long way from those services, and being able to access communities online is important for tackling loneliness.

Sarah Dyke

Will the hon. Lady give way?

Kirsty Blackman

I am really sorry but I will not; I am just going to finish.

As I was saying, it is really important that people can access those things, and like-minded individuals. When my son had Kawasaki disease—something that hardly anybody had heard of—I was able to find other parents whose children had been through the same thing, to learn how my son's disease might progress and how things might change. Access to the internet is really important.

Lastly, on disenfranchisement: if someone wants to get a voter authority certificate, the main way to do that is online. It is possible to get a certificate by post, but the process of proving one's identity in order to obtain a certificate—a requirement that the UK Government have brought in—is mainly online. People who are already disenfranchised and unable to access those services are therefore even more disenfranchised by the fact that the service is mainly online. I encourage the Government to ensure that things like voter authority certificates in particular are as widely available as possible, and not just online.

Online Safety Bill

Kirsty Blackman Excerpts
Paul Scully

The right hon. Lady raises some interesting points. We have conversed about harms, so I totally get her point about making sure that we tackle this issue in Parliament and are accountable in Parliament. As I have said, that will be done predominantly by monitoring the Bill through Ofcom's reporting on the harms it is having to deal with. We have regular engagement with Ofcom, not only here and through the Select Committees but through the Secretary of State.

On criminal liability, we conversed about that and made sure that liability attached to something specific, rather than taking the general approach proposed at the beginning. That means that we are not chilling innovation: people can understand, as they set up their approaches and systems, exactly what risk of criminal liability they are getting into.

Kirsty Blackman (Aberdeen North) (SNP)

The review mechanism strikes me as one of the places where the Bill falls down and is weakest, because there is not a dedicated review mechanism. We have needed this legislation for more than 30 years, and we have now got to the point of legislating. Does the Minister understand why I have no faith that future legislation will happen in a timely fashion, when it has taken us so long even to get to this point? Can he give us some reassurance that a proper review will take place, rather than just having Ofcom reports that may or may not be read?

Paul Scully

I have talked about the fact that we have to keep this legislation under review, because the landscape is fast-moving. At every stage of dealing with this Bill, I have said that inevitably we will have to come back to it. We can make the Bill as flexible, proportionate and technology-neutral as we can, but things are moving quickly. With all our work on AI, for example—the AI summit, the work of the Global Partnership on Artificial Intelligence, the international response, the Hiroshima accord and all the other areas that my hon. Friend the Member for Weston-super-Mare (John Penrose) spoke about earlier—we will have to come back, review the legislation and look at whether it remains world-beating. It is not just about the findings of Ofcom as it reports back to us.

I need to make a bit of progress, because I hope to have time to sum up a little bit at the end. We have listened to concerns about ensuring that the Bill provides the most robust protections for children from pornography and on the use of age assurance mechanisms. We are now explicitly requiring relevant providers to use highly effective age verification or age estimation to protect children from pornography and other primary priority content that is harmful to children. The Bill will also ensure a clear privacy-preserving and future-proofed framework governing the use of age assurance, which will be overseen by Ofcom.

There has been coverage in the media about how the Bill relates to encryption, which has often not been accurate. I take the opportunity to set the record straight. Our stance on challenging sexual abuse online remains the same. Last week in the other place, my noble Friend Lord Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, shared recent data from UK police forces that showed that 6,350 offences related to sexual communication with a child were recorded last year alone. Shockingly, 5,500 of those offences took place against primary school-age children. Those appalling statistics illustrate the urgent need for change. The Government are committed to taking action against the perpetrators and stamping out these horrific crimes. The information that social media companies currently give to UK law enforcement contributes to more than 800 arrests or voluntary attendances of suspected child sexual offenders on average every month. That results in an estimated 1,200 children being safeguarded from child sexual abuse.

There is no intention by the Government to weaken the encryption technology used by platforms. As a last resort, on a case-by-case basis, and only when stringent privacy safeguards have been met, Ofcom will have the power to direct companies to make best efforts to develop or source technology to identify and remove illegal child sexual abuse content. We know that this technology can be developed. Before it can be required by Ofcom, such technology must meet minimum standards of accuracy. If appropriate technology does not exist that meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a new solution.

--- Later in debate ---
Kirsty Blackman (Aberdeen North) (SNP)

It is a pleasure to speak during what I hope are the final stages of the Bill. Given that nearly all the Bills on which I have spoken up to now have been money Bills, this business of “coming back from the Lords” and scrutinising Lords amendments has not been part of my experience, so if I get anything wrong, I apologise.

Like other Members, I want to begin by thanking a number of people and organisations, including the Mental Health Foundation, Carnegie UK, the Internet Watch Foundation, the National Society for the Prevention of Cruelty to Children and two researchers for the SNP, Aaron Lucas and Josh Simmonds-Upton, for all their work, advice, knowledge and wisdom. I also join the hon. Members for Pontypridd (Alex Davies-Jones) and for Gosport (Dame Caroline Dinenage) in thanking the families involved for the huge amount of time and energy—and the huge amount of themselves—that they have had to pour into the process in order to secure these changes. This is the beginning of the culmination of all their hard work. It will make a difference today, and it will make a difference when the Bill is enacted. Members in all parts of the House will do what we can to continue to scrutinise its operation: to ensure that it works as intended, that children are kept as safe as possible online, and that Ofcom uses these powers to persuade platforms to provide the information about a child's use of social media that they will be required to provide following that child's death.

The Bill is about keeping people safe. It is a different Bill from the one that began its parliamentary journey, I think, more than two years ago. I have seen various Ministers leading from the Dispatch Box during that time, but the voices around the Chamber have been consistent, from the Conservative, Labour and SNP Benches. All the Members who have spoken have agreed that we want the internet to be a safer place. I am extremely glad that the Government have made so many of the concessions that the Opposition parties called for. I congratulate the hon. Member for Pontypridd on the inclusion of violence against women and girls in the Bill. She championed that in Committee, and I am glad that the Government have made the change.

Another change that the Government have made relates to small high-risk platforms. Back in May or June last year I tabled amendments 80, 81 and 82, which called for that categorisation to be changed so that it was not based just on the number of users. I think it was the hon. Member for Gosport who mentioned 4chan, and I have mentioned Kiwi Farms a number of times in the Chamber. Such organisations cannot be allowed to get away with horrific, vile content that encourages violence. They cannot be allowed a lower bar just because they have a smaller number of users.

The National Risk Register produced by the Cabinet Office—great bedtime reading, which I thoroughly recommend—states that both the likelihood of harm and the number of people on whom it would have an impact should be taken into account before a decision is made. It is therefore entirely sensible for the Government to take into account both the number of users, where it is significant, and the extremely high risk of harm caused by some of these providers.

Sir John Hayes

The hon. Lady is making an excellent speech, but it is critical to understand that this is not just about wickedness that would have taken place anyway but is now taking place on the internet; it is about the internet catalysing and exaggerating that wickedness, and spawning and encouraging all kinds of malevolence. We have a big responsibility in this place to regulate, control and indeed stop this, and the hon. Lady is right to emphasise that.

Kirsty Blackman

The right hon. Gentleman is entirely correct. Whether it involves a particularly right-wing cause or antisemitism—or, indeed, dieting content that drags people into something more radical in relation to eating disorders—the bubble mentality created by these algorithms massively increases the risk of radicalisation, and we therefore have an increased duty to protect people.

As I have said, I am pleased to see the positive changes that have been made as a result of Opposition pressure and the uncompromising efforts of those in the House of Lords, especially Baroness Kidron, who has been nothing short of tenacious. Throughout the time in which we have been discussing the Bill, I have spoken to Members of both Houses about it, and it has been very unusual to come across anyone who knows what they are talking about, and, in particular, has the incredible depth of knowledge, understanding and wisdom shown by Baroness Kidron. I was able to speak to her as someone who practically grew up on the internet—we had it at home when I was eight—but she knew far more about it than I did. I am extremely pleased that the Government have worked with her to improve the Bill, and have accepted that she has a huge breadth of knowledge. She managed to do what we did not quite manage to do in this House, although hopefully we laid the foundations.

I want to refer to a number of points that were mentioned by the Minister and in the letters that the Government provided on the Lords amendments. Algorithmic scrutiny is incredibly important, and I, along with other Members, have raised it a number of times—again, in connection with concern about radicalisation. Some organisations have been doing better things recently. For instance, when someone searches for something and begins to go down a rabbit hole, some companies now put up a flag—a video, for instance—suggesting that the user is heading somewhere dark and should look at something a bit lighter, and directing them away from the autoplaying of more radical content. If all organisations, or at least a significant number—particularly those with high traffic—can be encouraged to take such action rather than allowing people to be driven to more extreme content, that will be a positive step.

I was pleased to hear about the upcoming report on researcher access, and about the report on app stores. I asked a previous Minister about app stores a year or so ago; the Minister said that they were not included, and that was the end of it. Given the risk posed by app stores, the fact that what they host was not categorised as user-to-user content concerned me greatly. Someone who wants to put something on the Apple App Store has to jump through Apple's hoops, but the content is not owned by the app store—the same applies to some of the material on the PlayStation store. It is owned by the person who created it, in some cases a single individual, and it is therefore user-to-user content. There is no ongoing review of that. Age rating is another issue: app stores choose whatever age rating they happen to decide on. Some of the dating apps, such as match.com, have been active in that regard and have made it clear that their platforms are not for under-16s or under-18s, yet the app store has rated the content for a younger age than the platforms themselves allow. That is of concern, especially when those companies are trying to improve age rating.

On the subject of age rating, I am pleased to see more in the Bill about age assurance and the frameworks around it. I am particularly pleased to see what is going to happen to stop children being able to access pornography. That is incredibly important, but it had been missing from the Bill. I understand that Baroness Floella Benjamin has done a huge amount of work on pushing this forward and ensuring that parliamentarians are briefed on it, and I thank her for that work. Human trafficking has also been included. Again, that was something we pushed for, and I am glad to see that it has been put on the face of the Bill.

I want to talk briefly about the review mechanisms, and then I will go on to end-to-end encryption. I am still concerned that the review mechanisms are not strong enough. We have pushed, for example, for a parliamentary Committee to be convened to review this legislation. This is the fastest-moving area of life; things are changing dramatically. How many people in here had even heard of ChatGPT a year and a half ago? How many had used a virtual reality headset, or accessed Rec Room or any of the other VR systems? I understand that the Government have genuinely tried their best to make the Bill as future-proof as possible, but we have no parliamentary scrutiny mechanisms written into it. I am not trying to undermine the work of the Select Committee—I think it is incredibly important—but Select Committees are busy and have no legislative power in this regard. If the Government had written in a review, that would have been incredibly helpful.

Mr David Davis (Haltemprice and Howden) (Con)

The hon. Lady is making a very good speech. When I first came to this House, which was rather a long time ago now, there was a Companies Act every year, because company law was changing at the time, as was the nature of post-war capitalism. It seems to me that there is a strong argument for an annual Act on the handling and management of the internet. What she is saying is exactly right, and that is probably where we will end up.

Kirsty Blackman

I completely support the right hon. Member’s point—I would love to see this happening on an annual basis. I am sure that the Ministers who have shepherded the Bill through would be terrified of that, and that the Government team sitting over there are probably quaking in their boots at the suggestion, but given how fast this moves, I think that this would be incredibly important.

The Government's record on post-implementation reviews of legislation is pretty shoddy. If you ask Government Departments what percentage of their legislation has gone through a post-implementation review within the required timescale, they will say that it is very small. Some Departments are a bit better than others, but given the number of reshuffles there have been, some do not even know which pieces of legislation they are supposed to be reviewing. I am concerned that this legislation will get lost, and that there is no legislative underpinning for any of the mechanisms for reviewing it. The Minister has said that it will be kept under review, but can we have some sort of Government commitment that an actual review will take place, and that legislation will be forthcoming if necessary, to ensure that this Bill is implemented as intended? We are not necessarily asking the Government to change it; we are just asking them to make sure that it covers all the things that they intend it to cover.

On end-to-end encryption, on child sexual exploitation and abuse material, and on the last-resort powers over providers—I have been consistent with every Minister I have spoken to across the Dispatch Box, and every time I have spoken to hon. Members about this—where there is any use of child sexual exploitation or child sexual abuse material, we should be able to require the provider to find it. That absolutely trumps privacy. The largest increase in child sexual abuse material is in self-generated content, which is horrific; we are seeing a massive increase in that number. We need providers to be able to search—using the hashes with which such images can be categorised, or however else they choose to do it—for people who are sharing this material, so that the authorities can arrest them and put them behind bars, where they cannot cause any more harm to children. That is more important than any privacy concerns. Although Ministers had not put it in the Bill until this point, they have, to their credit, been clear that protecting children trumps privacy concerns when it comes to abuse material and exploitation. I am glad to see that that is now written into the Bill; it is important that it is not just stated at the Dispatch Box, even though it was mentioned by a number of Members.

--- Later in debate ---
Jamie Stone (Caithness, Sutherland and Easter Ross) (LD)

It is very kind of you to call me to speak, Mr Deputy Speaker. I apologise to your good self, to the Minister and to the House for arriving rather tardily.

My daughter and her husband have been staying with me over the past few days. When I get up to make my wife and myself an early-morning cup of tea, I find my two grandchildren sitting in the kitchen with their iPads, which does not half bring home the dangers. I look at them and think, “Gosh, I hope there is security, because they are just little kids.” I worry about that kind of thing. As everyone has said, keeping children safe is ever more important.

The Bill’s progress shows some of the best aspects of this place and the other place working together to improve legislation. The shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), and the hon. Member for Aberdeen North (Kirsty Blackman) both mentioned that, and it has been encouraging to see how the Bill has come together. However, as others have said, it has taken a long time and there have been a lot of delays. Perhaps that was unavoidable, but it is regrettable. It has been difficult for the Government to get the Bill to where it is today, and the trouble is that the delays mean there will probably be more victims before the Bill is enacted. We see before us a much-changed Bill, and I thank the Lords for their 150 amendments. They have put in a lot of hard work, as others have said.

The Secretary of State’s powers worry my party and me, and I wonder whether the Bill still fails to tackle harmful activity effectively. Perhaps better things could be done, but we are where we are. I welcome the addition of new offences, such as encouraging self-harm and intimate image abuse. A future Bill might be needed to set out the thresholds for the prosecution of non-fatal self-harm. We may also need further work on the intent requirement for cyber-flashing, and on whether Ofcom can introduce such requirements. I am encouraged by what we have heard from the Minister.

We would also have liked to see more movement on risk assessment, as terms of service should be subject to a mandatory risk assessment. My party remains unconvinced that we have got to grips with the metaverse—this terrifying new thing that has come at us. I think there is work to be done on that, and we will see what happens in the future.

As others have said, education is crucial. I hope that my grandchildren, sitting there with their iPads, have been told as much as possible by their teachers, my daughter and my son-in-law about what to do and what not to do. That leads me on to the huge importance of the parent being able, where necessary, to intervene rapidly, because this has to be done damned quickly. If it looks like they are going down a black hole, we want to stop that right away. A kid could see something horrid that could damage them for life—it could be that bad.

Kirsty Blackman

Once a child sees something, they cannot unsee it. This is not just about parental controls; we hope that the requirement on the companies to do the risk assessments and on Ofcom to look at those will mean that those issues are stopped before they even get to the point of requiring parental controls. I hope that such an approach will make this safer by design when it begins to operate, rather than relying on having an active parent who is not working three jobs and therefore has time to moderate what their children are doing online.

Jamie Stone

The hon. Lady makes an excellent point. Let me illustrate it by saying that each of us, when we were little—four, five or six—saw something that frightened us. Oddly enough, we never forget it for the rest of our lives, do we? That is what bad dreams are made of. We should remember that point, which is why those are wise words indeed.