(1 day, 18 hours ago)
Westminster Hall
I thank you for chairing this debate, Mr Stringer, and I congratulate the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) on bringing this debate to Westminster Hall. It is a subject we have talked about many times.
I want to make a number of points. The first is about safety by design. Page 1 of the Act states that the internet should be “safe by design”, yet everything that has happened since in the Act’s implementation, from the point of view of both Ofcom and the Government in respect of some of the secondary legislation, has not been about safety by design. It has been about regulating specific content, for example, and that is not where we should be. Much as I was happy that the Online Safety Act was passed, and I was worried about the perfect being the enemy of the good and all that, I am beginning to believe that the EU’s Digital Services Act will do a much better job of regulating, not least because the Government are failing to take enough action on this issue.
I am concerned that Ofcom, in collaboration with the Government, has managed to get us to a situation that makes nobody happy. It is not helpful for some of the tech companies. For example, category 1 is based solely on user numbers, which means that suicide forums, eating disorder platforms, doxing platforms and livestreaming platforms where self-generated child sexual abuse material is created are subject to exactly the same rules as a hill walking forum that gets three posts a week. In terms of proportionality, Ofcom is also failing the smallest platforms that are not risky, by requiring them to come to a three-day seminar on how to comply, when they might be run by a handful of volunteers spending a couple of hours a week looking after the forum and moderating every post. It will be very difficult for them to prove that children do not use their platforms, so there is no proportionality at either end of the spectrum.
In terms of where we are with the review, this is a very different Parliament from the one that began the conversations in the Joint Committee on the Draft Online Safety Bill. It felt like hardly anybody in these rooms knew anything about the online world or had any understanding of it. It is totally different now. There are so many MPs here who, for example, have an employment history of working hard to make improvements in this area. As the right hon. and learned Member said, we now have so much expertise in these rooms that we could act to ensure that the legislation worked properly. Rather than us constantly having to call these debates, the Government could rely on some of our expertise. They would not have to take on every one of a Joint Committee’s recommendations, for example, but they could rely on some of the expertise and the links that we have made over the years that we have been embedded in this area to help them make good decisions and ensure some level of safety by design.
Like so many Members in this place, I am concerned that the Act will not do what it is supposed to do. For me, the key thing was always keeping children safe online, whether that is about the commitments regularly given by the Government, which I wholeheartedly believe they wanted to fulfil, about hash matching to identify grooming behaviours, or about the doxing forums or suicide forums—those dark places of the internet—which will be subject to exactly the same rules as a hill walking forum. They are just going to fill in a risk assessment and say, “No children use our platform. There’s no risk on our platform, so it’s all good.” The Government had an opportunity to categorise them and they chose not to. I urge them to change their mind.
(3 weeks, 2 days ago)
General Committees
It is a pleasure to serve under your chairship, Sir Christopher. I am disappointed in this statutory instrument. I recognise the Minister’s acknowledgment of the small sites, high-harm issue, but the issue is far more important and we are missing an opportunity here. Can the Minister set out why the regulations as drafted do not follow the will of Parliament, accepted by the previous Government and written into the Act, that thresholds for categorisation can be based on risk or size? That was a long-argued point that went through many iterations.
The then Minister accepted the amendment that was put forward and said:
“many in the House have steadfastly campaigned on the issue of small but risky platforms.” —[Official Report, 12 September 2023; Vol. 737, c. 806.]
He confirmed that the legislation would now give the Secretary of State the discretion to decide whether to set a threshold based on the number of users or the functionalities offered, or both factors, with the change ensuring that the framework was as flexible as possible in responding to the risk landscape. That flexibility has been thrown away in this new legislation. The Minister just said that we must do everything in our power, yet the Government are throwing out a crucial change that was made to the Act precisely to give them more power; by proceeding in this way, they are giving that power up.
The amendment was to ensure that small sites dedicated to harm, such as sites providing information on suicide or self-harm, or sites set up to target abuse and hatred at minority groups, as we saw in the riots in the summer, were subject to the fullest range of duties. When Ofcom published its advice, however, it disregarded this flexibility and advised that regulations should be laid bringing only the large platforms into category 1.
Is the hon. Member as concerned as I am that the Government seem to be ignoring the will of Parliament in their decision? Is he worried that young people particularly will suffer as a result?
Absolutely—I am. The Secretary of State’s decision to proceed with this narrow interpretation of the Online Safety Act provisions, and the failure to use the power they have to reject Ofcom’s imperfect advice, will allow small, risky platforms to continue to operate without the most stringent regulatory restrictions available. That leaves significant numbers of vulnerable users—women and individuals from minority groups—at risk of serious harm from targeted activity on these platforms.
I will put a few more questions to the Minister. How do His Majesty’s Government intend to assess whether Ofcom’s regulatory approach to small but high-harm sites is proving effective, and have any details been provided on Ofcom’s schedule of research about such sites? What assessment have the Government made of the different harms occurring on small, high-harm platforms? Have they broken this down by type of harm, and will they make such information available? Have the Government received legal advice about the use of service disruption orders for small but high-harm sites? Do the Government expect Ofcom to take enforcement action against small but high-harm sites, and have they made an assessment of the likely timescales for enforcement action? Will the Government set out criteria against which they expect Ofcom to keep its approach to small but high-harm sites under continual review, as set out in their draft statement of strategic priorities for online safety?
Was the Minister aware of the previous Government’s commitment that Select Committees in both Houses would be given the opportunity to scrutinise draft Online Safety Act statutory instruments before they were laid? If she was, why did that not happen in this case? Will she put on record her assurances that Online Safety Act statutory instruments will in future be shared with the relevant Committees before they are laid?
For all those reasons, I will vote against the motion.
I appreciate the opportunity to speak in this Committee, Sir Christopher. Like at least one other Member in the room, I lived the Online Safety Bill for a significant number of months—in fact, it seemed to drag on for years.
As the Minister said, the Online Safety Act is long overdue. We have needed this legislation for 30 years, since I was a kid using the internet in the early ’90s. There has always been the risk of harm on online platforms, and there have always been places where people can be radicalised and can see misogynistic content or content that children should never be able to see. In this case, legislation has moved significantly slower than society—I completely agree with the Minister about that—but that is not a reason for accepting the statutory instrument or agreeing with the proposed threshold conditions.
On the threshold conditions, I am unclear as to why the Government have chosen 34 million and 7 million for the average monthly active users. Is it 34 million because Reddit happens to have 35 million average UK users—is that why they have taken that decision? I absolutely believe that Reddit should be in scope of category 1, and I am pretty sure that Reddit believes it should be in scope of category 1 and have those additional duties. Reddit is one of the places where the functionalities and content recommendation services mean that people, no matter what age they are, can see incredibly harmful content. They can also see content that can be incredibly funny—a number of brilliant places on Reddit allow people to look at pictures of cats, which is my favourite way to use the internet—but there are dark places in Reddit forums, where people can end up going down rabbit holes. I therefore agree that platforms such as Reddit should be in scope of category 1.
The Minister spoke about schedule 11 and the changes that were made during the passage of the Act, and she is absolutely right. Paragraph 1(5) of that schedule states:
“In making regulations under sub-paragraph (1), the Secretary of State must take into account the likely impact of the number of users of the user-to-user part of the service, and its functionalities, on how easily, quickly and widely regulated user-generated content is disseminated by means of the service.”
However, that does not undo the fact that we as legislators made a change to an earlier provision in that schedule. We fought for that incredibly hard and at every opportunity—in the Bill Committee, on the Floor of the House, in the recommitted Committee and in the House of Lords. At every stage, we voted for that change to be made, and significant numbers of outside organisations cared deeply about it. We wanted small high-risk platforms to be included. The provision that was added meant that the Secretary of State must make regulations relating to
“any other characteristics of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant.”
That was what the Government were willing to give us. It was not the original amendment that I moved in Bill Committee, which was specifically about small high-risk platforms, but it was enough to cover what we wanted.
What functionalities could and should be brought in scope? I believe that any service that allows users to livestream should be in the scope of category 1. We know that livestreaming is where the biggest increase in self-generated child sexual abuse material is occurring. We know that livestreaming is incredibly dangerous, as people who are desperate to get access to child sexual abuse material can convince vulnerable young people and children to livestream. There is no delay in which that content can be looked at and checked before it is put up, yet the Government do not believe that every service that allows six-year-olds to livestream should be within the scope of category 1. The Government do not believe that those services should be subject to those additional safety duties, despite the fact that section 1 of the Online Safety Act 2023 says platforms should be “safe by design”. However, this is not creating platforms that are safe by design.
The regulations do not exclude young people from the ability to stream explicit videos to anyone, because they include only services with over 34 million users, or over 7 million when it comes to content recommendation. I agree that services of that scale are problematic. However, there are other really problematic services, causing life-changing—or in some cases, life-ending—problems for children, young people and vulnerable adults, that will not be in the scope of category 1.
Generally, I am not a big fan of a lot of things that the UK Government have done; I have been on my feet, in the Chamber, arguing against a significant number of those things. This is one of the things that makes me most angry, because the Government, by putting forward this secondary legislation, are legislating in opposition to the will and intention of the Houses of Parliament. I know that we cannot bind a future Government or House, but this is not what was intended, agreed and moved, nor the basis on which Royal Assent was given; that was given on the basis of assurances from Government Ministers that they would look at those functionalities and at small but high-risk platforms.
Given what Ofcom has put out in guidance and information about what it is doing on small but high-risk platforms, why are we not using everything that is available? Why are the Government not willing to use everything available to them to bring those very high-risk platforms into the scope of category 1?
The changes that category 1 services would be required to make include additional duties; for a start, they are under more scrutiny—which is to be expected—and they are put on a specific list of category 1 services, which will be published. That list of category 1 services would include platforms such as 4chan, which some people may never have heard of. Responsible parents will see that list and say, “Hold on a second. Why is 4chan on there? I don’t want my children to be going on there. It is clearly not a ginormous platform, therefore it must be on there because it is a high-risk service.” Parents will look at that list and talk to their children about those platforms. The category 1 list alone, never mind the additional duties, would have a positive impact. Putting suicide forums on that list of category 1 services would have a positive impact on the behaviour of parents, children, and the teachers who teach those young people how to access the internet safely.
I guarantee that a significant number of teachers and others who are involved with young people have never heard of 4chan, but putting it on that list would give them an additional tool to enable them to approach young people and talk about the ways in which they use the internet.
I thank the hon. Lady for speaking so passionately on this matter. As the Liberal Democrat mental health spokesperson, something that we are increasingly coming across is that it is not just adults asking children to livestream, but children, peer to peer, who do not realise that it is illegal. As the hon. Lady touched on, the mental health impact is huge but also lifelong. Someone can have a digital footprint that they can never get rid of, and children who are uninformed and uneducated about the impacts of their decisions could be affected decades into the future.
I completely agree. That is an additional reason why livestreaming is one of my biggest concerns. That functionality should have been included as a matter of course. Any of the organisations that deal with young people and the removal of child sexual abuse material online, such as the Internet Watch Foundation, will tell you that livestreaming is a huge concern. The hon. Member is 100% correct.
That is the way I talk to my children about online safety: once something is put online—once it is on the internet—it cannot ever be taken back. It is there forever, no matter what anyone does about it, and young people may not have the capacity to understand that. If systems were safe by design, young people simply would not have access to livestreaming at all; they would not have access to that functionality, so there would be that moment of thinking before they do something. They would not be able to do peer-to-peer livestreaming that can then be shared among the entire school and the entire world.
We know from research that a significant amount of child sexual abuse material is impossible to take down. Young people may put their own images online or somebody else may share them without their consent. Organisations such as the Internet Watch Foundation do everything they can to try to take down that content, but it is like playing whack-a-mole; it comes up and up and up. Once they have fallen into that trap, the content cannot be taken back. If we were being safe by design, we would ensure, as far as possible—as far as the Government could do, we could do or Ofcom could do—that no young person would be able to access that functionality. As I said, it should have been included.
I appreciate what the Government said about content recommendation and the algorithms that are used to ensure that people stay on platforms for a significant length of time. I do not know how many Members have spent much time on TikTok, but people can start watching videos of cats and still be there an hour and a half later. The algorithms are there to try to keep us on the platform. They are there because, actually, the platforms make money from our seeing the advertisements. They want us to see exciting content. Part of the issue with the content recommendation referenced in the conditions is that platforms are serving more and more exciting and extreme content to try to keep us there for longer, so we end up with people being radicalised on these platforms—possibly not intentionally by the platforms, but because their algorithm serves more and more extreme content.
I agree that that content should have the lower threshold in terms of the number of users. I am not sure about the numbers of the thresholds, but I think the Government have that differentiation correct, particularly on the addictive nature of algorithmic content. However, they are failing on incredibly high-risk content. The additional duties for category 1 services involve a number of different things: illegal content risk assessments, duties relating to terms of service, children’s risk assessments, adult empowerment duties and record-keeping duties. As I said, the fact that those category 1-ranked platforms will be on a list is powerful in itself, but adding those additional duties is really important.
Let us say that somebody is undertaking a risky business—piercing, for example. Even though not many people get piercings in the grand scheme of things, the Government require piercing businesses to jump through additional hoops because they are involved in a dangerous activity that carries a risk of infection and other associated harms. They are required to meet hygiene regulations, register with environmental health and have checks of their records to ensure that they know who is being provided with piercings, because it is a risky thing. The Government are putting additional duties on them because they recognise that piercing is risky and potentially harmful.
However, the Government are choosing not to put additional duties on incredibly high-risk platforms. They are choosing not to do that. They have been given the right to do that. Parliament has made its will very clear: “We want the Government to take action over those small high-risk platforms.” I do not care how many hoops 4chan has to jump through. Give it as many hoops as possible; it is an incredibly harmful site, and there are many others out there—hon. Members mentioned suicide forums, for example. Make them jump through every single hoop. If we cannot ban them outright—which would be my preferred option—make them keep records, make them have adult-empowerment duties, and put them on a list of organisations that we, the Government or Ofcom reckon are harmful.
If we end up in a situation where, due to the failures of this Act, a young person commits suicide and the platform involved is not categorised properly, there is then a reduction in the protections, and in the information that the platform has to provide to the family about the deceased child, because it is not categorised as category 1 or 2B. We could end up in a situation where a young person dies as a result of being radicalised on a forum—because the Government decided it should not be in scope—but that platform does not even have to provide the deceased child’s family with access to their online usage. That is shocking, right? If the Government are not willing to take the proper action required, they should at least bring these platforms into the scope of the actions and requirements related to deceased children.
I appreciate that I have taken a significant length of time—although not nearly as long as the Online Safety Act has taken to pass, I hasten to say—but I am absolutely serious about the fact that I am really, really angry about this. This is endangering children. This is endangering young people. This is turning the Online Safety Act back into what some people suggested it should be at the beginning: an anti-Facebook and anti-Twitter Act, or a regulation of Facebook and Twitter—or X—Act, rather than something that genuinely creates what it says in section 1 of the Act: an online world that is “safe by design”.
This is not creating an online world that is safe by design; this is opening young people and vulnerable adults up to far more risks than it should. The Government are wilfully making this choice, and we are giving them the opportunity to undo this and to choose to make the right decision—the decision that Parliament has asked them to make—to include functionalities such as livestreaming, and to include those high-risk platforms that we know radicalise people and put them at a higher risk of death.
I thank all Members for their very powerful contributions to the debate. This instrument will bring us one step closer to a safer online world for our citizens. It is clearer than ever that it is desperately needed: transparency, accountability and user empowerment matter now more than ever.
The Opposition spokesperson, the hon. Member for Huntingdon, asked whether we agree on the need for companies not to wait for the duties in the Act to be implemented, but to ensure that safety is baked in from the start. I absolutely agree, and he will be aware that the Secretary of State has made that point on many occasions. He also raised the issue of proportionality. I confirm that many of the duties on categorised services are subject to the principle of proportionality, which requires Ofcom to consider measures that are technically feasible for providers of a certain size or capacity, and in some cases duties are based on the assessment of risk of harm presented by the service.
For example, in determining what is proportionate for the user empowerment duties on content for category 1 services, the findings of the most recent user empowerment assessments are relevant. They include the incidence of relevant content on the service in addition to the size and capacity of the provider. Where a code of practice is relevant to a duty, Ofcom must have regard to the principles on proportionality, and what is proportionate for one kind of service might not be for another.
The hon. Member for Huntingdon is absolutely right that the pornography review has been completed. The Government are reviewing that at the moment and will publish it in due course.
In response to the hon. Members for Newton Abbot and for Aberdeen North (Kirsty Blackman) and to the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), when the Online Safety Act was introduced, category 1 thresholds were due to be assessed based on the level of risk and harm for adults—as the Members read out very clearly. That was removed during the passage of the Bill by the previous Government.
As things stand, although Baroness Morgan’s successful amendment made it possible for threshold conditions to be based solely on functionalities, it did not change the basis of Ofcom’s research, which for category 1 is easy, quick and wide dissemination of content. The Secretary of State had to consider that. I will repeat that for all Members to hear again: the Secretary of State has to act within the powers given to him in schedule 11 when setting the threshold conditions. The powers do not allow for thresholds to be determined by another body, as per the amendment.
Although the hon. Member for Aberdeen North very powerfully read out the Act, it very clearly does not do what she is asking it to do. We absolutely agree that small but risky sites need to be covered, but as it stands, the Secretary of State does not have the powers to include them.
Sorry, I have lots of points to cover. If I have not covered the hon. Member’s concerns in my response, she is more than welcome to intervene later.
These small but risky services are of significant concern to the Government, and they will still have to protect against illegal content and, where relevant, content that is harmful to children. Ofcom also has a dedicated taskforce to go after them. I hope that answers the hon. Member’s question.
The hon. Member for Newton Abbot also raised the review of Ofcom’s approach. The regulator has already trialled an approach of targeting small but risky services through its regulation of video-sharing platforms. Indeed, a number of those services improved their policies and content moderation in response. All the adult platforms under the VSP regime, large and small, have implemented age verification through this route to ensure that under-18s cannot access pornography on their services. In instances where services fail to make necessary changes, they will face formal enforcement action from Ofcom. Ofcom has a proven track record and the Government have every faith in its ability to take action against non-compliant services.
The hon. Member also raised issues around how Ofcom will enforce action against small but risky services. Ofcom will have robust enforcement powers available to use against companies that fail to fulfil their duties and it will be able to issue enforcement decisions. Action can include fines of up to £18 million or 10% of qualifying worldwide revenue in the relevant year, whichever is higher, and Ofcom can direct companies to take specific steps to comply with its regulation.
I am going to make some progress. On livestreaming, Ofcom considered that functionality, but concluded that the key functionalities that spread content easily, quickly and widely are content recommender systems and forwarding or resharing user-generated content.
Services accessed by children must still be safe by design, regardless of whether they are categorised. Small but risky services will also still be required to comply with illegal content duties. The hon. Member for Aberdeen North should be well aware of that as she raised concerns on that issue.
On child safety, there were questions about how the online safety regime protects children from harmful content. The Act requires all services in scope to proactively remove and prevent users from being exposed to priority illegal content, such as illegal suicide content and child sexual exploitation and abuse material. That is already within the remit.
In addition, companies that are likely to be accessed by children will need to take steps to protect children from harmful content and behaviour on their services, including content that is legal but none the less presents a risk of harm to children. The Act designates content that promotes suicide or self-harm as primary priority content that is harmful to children. Parents and children will also be able to report pro-suicide or pro-self-harm content to the platform, and the reporting mechanism will need to be easy to navigate for child users. On 8 May, Ofcom published its draft children’s safety codes of practice, in which it proposed measures that companies should employ to protect children from suicide and self-harm content, as well as other content.
Finally, on why category 1 is not based on risk, such as the risk of hate speech, when the Act was introduced, category 1 thresholds were due to be assessed on the level of risk of harm to adults from priority content disseminated by means of that service. As I said earlier, that was removed during the Act’s passage by the then Government and replaced with consideration of the likely functionalities and how easily, quickly and widely user-generated content is disseminated, which is a significant change. Although the Government understand that that approach has its critics, who argue that the risk of harm is the most significant factor, that is the position under the Act.
The Minister is making the case that the Secretary of State’s hands are tied by the Act —that it requires stuff in relation to the number of users. Can she tell us in which part of the Act it says that, because it does not say that? If she can tell us where it is in the Act, I am quite willing to sit down and shut up about this point, but it is not in the Act.
The legislation allows the Secretary of State to deviate from Ofcom’s advice and to publish a statement explaining why. However, the core consideration for category 1 under schedule 11 is—I repeat for the third time—how easily, quickly and widely regulated user-generated content is disseminated by means of a service. As a result, for category 1, Ofcom concluded that the content is disseminated with increased breadth as the number of users increases.
The decision to proceed with the threshold combination recommended by Ofcom, rather than discounting user-number thresholds, reflects the fact that any threshold condition created by the Government should consider the factors as set out in the Act, including easy, quick and wide dissemination for category 1, and the evidence base. That is what the Act says. As a result, the Government decided not to proceed with an approach that deviated from Ofcom’s recommendation, particularly considering the risk of unintended consequences.
I am more than happy to write to the hon. Member for Aberdeen North with the full details. I understand that she feels very passionately about this point, but the Act is the Act. Although I am grateful for her contribution, I have to follow what the Act says, based on the legal advice that I get.
Thank you, Sir Christopher—I appreciate that prod. I did look at Standing Orders this morning, but could not find that bit, so that is incredibly helpful.
On what the Minister said about schedule 11 and the notes that she has been passed from her team on that point, I appreciate her commitment to share the Government’s legal advice. That will be incredibly helpful; it would have been helpful to have it in advance of this Committee.
In schedule 11, it says:
“In making regulations under sub-paragraph (1), the Secretary of State must take into account the likely impact of the number of users of the user-to-user part of the service, and its functionalities, on how easily, quickly and widely regulated user-generated content is disseminated by means of the service.”
Perhaps I cannot read English, or perhaps the Minister, her legal advisers and the team at DSIT read it in a different way from me, but the Secretary of State having to take something into account and the Secretary of State being bound by something are two different things—they are not the same. It does not say that the Secretary of State must regulate only on the specific number of users.
In fact, schedule 11 says earlier that the Secretary of State
“must make regulations specifying conditions…for the user-to-user part of regulated user-to-user services relating to each of the following”,
which are the
“number of users…functionalities of that part of the service, and…any other characteristics of that part of the service or factors”.
The Secretary of State must therefore make regulations in relation to any other characteristics of that part of the service or factors
“relating to that part of the service that the Secretary of State considers relevant.”
He must do that, whereas the number of users is something he merely has to take into account. The Government, however, have decided that what must merely be taken into account outweighs what they must actually do. They have decided that despite Parliament being pretty clear in the language it has used.
I am not terribly happy with the Online Safety Act. It is a lot better than the situation we have currently, but it is far from perfect. As the Minister said, I argued in favour of keeping the stuff about legal but harmful content for adults. I argued against the then Government’s position on that, but the Act is the Act that we have.
The Minister’s point does not make sense. The Secretary of State has to take into account the number of users and how quickly things are disseminated, but he must make regulations about functionalities or factors that he considers relevant. Therefore, it seems that he does not consider suicide forums and livestreaming to be relevant; if he did, he would surely be bound by the “must” and would have to make regulations about them. It is frustrating that the Act does not do what it is supposed to do and does not protect young people from livestreaming. The Minister said that it protects people from seeing that illegal content, but it does not prevent them from creating it.
The Government could make regulations so that every platform that has a livestreaming functionality, or even every platform that has child users on it—there is a lot in the Act about the proportion of children who use a service—is automatically included in category 1 because they consider them to be high risk.
It would not be right for either of us to ask the Minister to disclose legal advice—that clearly would not be appropriate—but I am grateful for the Minister’s offer to share a slightly more expansive description of why the Government have come to the conclusion that they have.
On the hon. Lady’s point about what the Act actually says, we have both quoted paragraph 1(5) of schedule 11, which deals with whether the language that has found its way into the ministerial statement is the be-all and end-all of the Minister’s conclusions. We both think it is not. If it is the case, as I think the Minister is arguing, that the ability to disseminate “easily, quickly and widely” is essentially a synonym for the scale of the service and the number of its users, what does the hon. Lady think of the amendment that Baroness Morgan made in the other place to paragraph 1(4), which says that when the regulations we are considering specify
“the way or ways in which the relevant conditions are met”,
for category 1 threshold conditions
“at least one specified condition about number of users or functionality must be met”?
The crucial word that was added is “or”. If the number of users were required to establish what the hon. Lady has described, the word “or” would be inappropriate.
I absolutely agree, and that is a helpful clarification.
If the Government have decided that it is too difficult to regulate high-risk platforms as category 1, and that they do not matter enough because they do not have enough of an impact, they should stand up and tell us that. Rather than saying that their hands have been tied by the Act—they manifestly have not—they need to take ownership of their actions. If they have decided that such platforms are not important enough or that they cannot be bothered having a fight with Ofcom about that, they should be honest and say, “This is the position we have decided to take.” Instead, they are standing up and saying, “Our hands have been tied,” but that is just not correct: their hands have not been tied by the Act.
I appreciate that the Minister will get in touch with me about the legal advice, but it will be too late. This statutory instrument will have been through the process by that time, and people will have been put at risk as a result of the Government’s failure. They have the power to take action in relation to functionalities and factors, and in relation to suicide forums, livestreaming and the creation of child sexual abuse material, and they are choosing not to.
If the Government have decided that it is too difficult to do that, that those platforms are not risky enough and that not enough people are being harmed by them, they need to hold their hands up and say, “We’ve decided that this is the position we are going to take.” They must not hide behind the legislation, which does not say what they are telling us it says. They should just be honest about the fact that they have decided that they cannot be bothered to take action. They cannot be bothered to have a fight with Ofcom because it is not important enough. Hiding behind the legislation is incredibly cowardly—it does not say that.
(3 weeks, 3 days ago)
Commons Chamber
Not only have we set aside £520 million precisely to be able to invest in the life sciences industry with an innovation fund, but we are also very keen to work with specific businesses to understand how they can make more secure, long-term investment. The single most important thing for most people making an investment in the UK is whether they believe there is political, fiscal and financial stability in the UK. That is what we are absolutely determined to deliver. My hon. Friend makes a very good point about those who are immunosuppressed for all sorts of different reasons, whether because of their medication or a condition. I will take that point back to the Department.
The Chancellor said that economic growth is the most important thing and this was an opportunity to get some of that economic growth. This was an opportunity to get something over the line and the UK Government failed to deliver it. How can the House and the public trust anything the UK Government say? How can they say that this is the founding mission if they then fail to deliver for a region that could really do with that economic growth?
The thing is that spending taxpayers’ money has to be proven to be good value for money. That is why, whenever we are making an investment such as this, we have to make sure it delivers more return on investment than £1 for £1. When AstraZeneca made the decision to cut the R&D part of its budget from £150 million to £90 million, it made sense for the UK Government to look again at the amount of money we could legitimately put in on behalf of the taxpayer. If the hon. Lady had been in my place, I think she would have made exactly the same decision.
(3 months ago)
Westminster Hall
I thank my hon. Friend for raising the really important—indeed, deeply concerning—issue of the rise of anti-women hate, with the perpetrators marketing themselves as successful men.
What we are seeing is that boys look at such videos and do not agree with everything that is said, but little nuggets make sense to them. For me, it is about the relentless bombardment: if someone sees one video like that, they might think, “Oh right,” and not look at it properly, but they are relentlessly targeted by the same messaging over and over again.
That is true not just for misogynistic hate speech, but for body image material. Girls and boys are seeing unrealistic body images, which are often completely fake and carry fake messaging, but which make them reflect on their own bodies in a negative way, when they may not have had those thoughts before.
I want to drive home that being 14 years old is tough. I am really old now compared with being 14, but I can truly say to anybody who is aged 14 watching this: “It gets better!” It is hard to be a 14-year-old: they are exploring their body and exploring new challenges. Their hormones are going wild and their peers are going through exactly the same thing. It is tough, and school is tough. It is natural for children and young people to question their identity, their role in the world, their sexuality, or whatever it is they might be exploring—that is normal—but I am concerned that that bombardment of unhealthy, unregulated and toxic messaging at a crucial time, when teenagers’ brains are developing, is frankly leading to a crisis.
I return to an earlier point about whether the parts of apps or platforms that children are using are actually safe for them to use. There are different parts of apps that we all use—we may not all be tech-savvy, but we do use them—but when we drill into them and take a minute to ask, “Is this safe for children?”, the answer for me is, “No.”
There are features such as the live location functionality, which comes up a lot on apps: for example, when someone is using a maps app, it asks for their live location so that they can see how to get from A to B. That is totally fine, but there are certain social media apps that children use that have their live location on permanently. They can toggle it to turn it off, but when I asked children in Darlington why they did not turn it off, they said there is a peer pressure to keep it on—it is seen as really uncool to turn it off. It is also about being able to see whether someone has read a message or not.
I then said to those children, “Okay, but those apps are safe because you only accept people you know,” and they said, “Oh no, I’ve got thousands and thousands of people on that app, and it takes me ages to remove each person, because I can’t remember if I know them, so I don’t do it.” They just leave their location on for thousands of people, many of whom may be void accounts, and they do not even know if they are active any more. The point is that we would not allow our children to go into a space where their location was shown to lots of strangers all the time. Those children who I spoke to also said that the live location feature on some of these apps is leading to in-person bullying and attacks. That is absolutely horrifying.
On that point, is the hon. Member aware that if someone toggles their location off on Snapchat, for example, it constantly—in fact, every time the app is opened—says, “You’re on ghost mode. Do you want to turn your location back on?” So every single time someone opens the app, it tries to convince them to turn their location back on.
I thank the hon. Member for raising that issue, because there are lots of different nudge notifications. We can understand why, because it is an unregulated space and the app is trying to get as much data as possible—if we are not paying for the service, we are the service. We all know that as adults, but the young people and children who we are talking about today do not know that their data is what makes them attractive to that app.
I could talk for hours on this subject, Mr Dowd, but do not worry, I will not. There are a number of things that I would like to say. Not many Members present sat through the majority of the Online Safety Bill Committee as it went through Parliament, but I was in every one of those meetings, listening to various views and debating online safety.
I will touch on one issue that the hon. Member for Darlington (Lola McEvoy) raised in her excellent and important speech. I agree with almost everything she said. Not many people in Parliament have her level of passion or knowledge about the subject, so I appreciate her bringing forward the debate.
On the issue of features, I totally agree with the hon. Member, and I moved an amendment to that effect during the Bill’s progress. There should be restrictions on the features that children are able to access. She was talking about safety by design, so that children do not have to see content that they cannot unsee, do not have to experience the issues that they cannot un-experience, cannot be contacted by external people who they do not know, and cannot livestream. We have seen an increase in the amount of self-generated child sexual abuse material, and livestreaming is a massive proportion of that.
Yesterday, a local organisation in Aberdeen called CyberSafe Scotland launched a report on its work in 10 of our primary schools with 1,300 children aged between 10 and 12—primary school children, not secondary school children. Some 300 of those children wrote what is called a “name it”, where they named a problem that they had seen online. Last night, we were able to read some of the issues that they had raised. Pervasive misogyny is everywhere online, and it is normalised. It is not just in some of the videos that they see and it is not just about the Andrew Tates of this world—it is absolutely everywhere. A couple of years ago there was a trend in online videos of young men asking girls to behave like slaves, and that was all over the place.
Children are seeing a different online world from the one that we experience because they have different algorithms and have different things pushed at them. They are playing Roblox and Fortnite, but most of us are not playing those games. I am still concerned that the Online Safety Act does not adequately cover all of the online gaming world, which is where children are spending a significant proportion of their time online.
A huge amount more needs to be done to ensure that children are safe online. There is not enough in place for reviewing the online safety legislation, something that Members on both sides of the House pushed for to ensure that the legislation is kept as up to date as possible. The online world changes very rapidly: the scams that were happening nine months ago are totally different from those happening today. I am still concerned that the Act focuses too much on the regulation of Facebook, for example, rather than the regulation of the online world that our children actually experience. CyberSafe Scotland intentionally centred the views and rights of young people in its work, which meant that the programmes that it delivered in schools were much more appropriate and children were much better able to listen and react to them.
The last thing that I will mention is Girlguiding and its girls’ attitudes survey. It is published on an annual basis and shows a huge increase in the number of girls who feel unsafe. That is because of the online world they are experiencing. We have a huge amount of responsibility here, and I appreciate the hon. Member for Darlington bringing the debate forward today.
I will keep this to an informal four-minute limit. Regrettably, if Members speak beyond that, I will have to introduce a formal figure.