Online Safety Bill (Fifth sitting) Debate
Kirsty Blackman (Scottish National Party, Aberdeen North), in debate with the Department for Digital, Culture, Media & Sport
Public Bill Committees

We do not oppose clauses 2, 3 or 4, or the intentions of schedules 1 and 2, and have not sought to amend them at this stage, but this is an important opportunity to place on record some of the Opposition’s concerns as the Bill proceeds.
The first important thing to note is the broadness in the drafting of all the definitions. A service has links to the UK if it has a significant number of users in the UK, if the UK users are a target market, or if
“there are reasonable grounds to believe there is a material risk of significant harm to individuals”
in the UK using the service. Thus, territorially, a very wide range of online services could be caught. The Government have estimated in their impact assessment that 25,100 platforms will be in scope of the new regime, which is perhaps a conservative estimate. The impact assessment also notes that approximately 180,000 platforms could potentially be considered in scope of the Bill.
The provisions on extraterritorial jurisdiction are, again, extremely broad and could lead to some international platforms seeking to block UK users in a way similar to that seen following the introduction of GDPR. Furthermore, as has been the case under GDPR, those potentially in scope through the extraterritorial provisions may vigorously resist attempts to assert jurisdiction.
Notably absent from schedule 1 is an attempt to include or define how the Bill and its definitions of exempt services may adapt to emerging future technologies. The Minister may consider that a matter for secondary legislation, but as he knows, the Opposition feel that the Bill already leaves too many important matters to be determined at a later stage via statutory instruments. Although it is good to see that the Bill has incorporated everyday internet behaviour such as a like or dislike button, as well as factoring in the use of emojis and symbols, it fails to consider how technologies such as artificial intelligence will sit within the framework as it stands.
It is quite right that there are exemptions for everyday user-to-user services such as email, SMS, and MMS services, and an all-important balance to strike between our fundamental right to privacy and keeping people safe online. That is where some difficult questions arise on platforms such as WhatsApp, which are embedded with end-to-end encryption as a standard feature. Concerns have been raised about Meta’s plans to extend that feature to Instagram and Facebook Messenger.
The Opposition also have concerns about private messaging features more widely. Research from the Centre for Missing and Exploited Children highlighted the fact that a significant majority of online child abuse takes place in private messages. For example, 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Furthermore, recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people they have not met offline before. Nearly three quarters—74%—of cases of children contacted by someone they do not know initially take place by private message. We will address this issue further in new clause 20, but I wanted to highlight those exemptions early on, as they are relevant to schedule 1.
On a similar point, we remain concerned about how emerging online systems such as the metaverse have had no consideration in the Bill as it stands. Only last week, colleagues will have read about a researcher from SumOfUs, a non-profit organisation that seeks to limit the power of large corporations, who claimed that she experienced sexual assault by a stranger in Meta’s virtual reality space, Horizon Worlds. The organisation’s report said:
“About an hour into using the platform, a SumOfUs researcher was led into a private room at a party where she was raped by a user who kept telling her to turn around so he could do it from behind while users outside the window could see—all while another user in the room watched and passed around a vodka bottle.”
There is currently no clear indication of how these very real technologies will sit within the Bill more widely. Even more worryingly, there has been no consideration of how virtual reality systems such as Horizon Worlds, with clear user-to-user functions, fit within the exemptions in schedule 1. If we are to see exemptions for internal business services or services provided by public bodies, along with many others, as outlined in the schedule, we need to make sure that the exemptions are fit for purpose and in line with the rapidly evolving technology that is widely available overseas. Before long, I am sure that reality spaces such as Horizon Worlds will become more and more commonplace in the UK too.
I hope that the Minister can reassure us all of his plans to ensure that the Bill is adequately future-proofed to cope with the rising expansion of the online space. Although we do not formally oppose the provisions outlined in schedule 1, I hope that the Minister will see that there is much work to be done to ensure that the current exemptions remain applicable to future technologies too.
Turning to schedule 2, the draft Bill was hugely lacking in provisions to tackle pornographic content, so it is a welcome step that we now see some attempts to address the ease with which pornographic content can be accessed by children across the country. As we all know, the draft Bill only covered pornography websites that allow user-generated content such as OnlyFans. I am pleased to see that commercial pornography sites have now been brought within scope. This positive step forward has been made possible thanks to the incredible efforts of campaigning groups, of which there are far too many to mention, and from some of which we took evidence. I pay tribute to them today. Over the years, it is thanks to their persistence that the Government have been forced to take notice and take action.
Once again—I hate to repeat myself—I urge the Minister to consider how far the current definitions outlined in schedule 2 relating to regulated provider pornographic content will stretch to cover virtual technologies such as those I referred to earlier. We are seeing an increase in all types of pornographic and semi-pornographic content that draws on AI or virtual technology. An obvious example is the now thankfully defunct app called DeepNude, which was making the rounds online in 2016. While available, the app used neural networks to remove clothing from images of women, making them look realistically nude. The ramifications and potential for technology like this to take over the pornographic content space are essentially limitless.
I urge the Minister carefully to keep in mind the future of the online space as we proceed. More specifically, the regulation of pornographic content in the context of keeping children safe is an area where we can all surely get on board. The Opposition have no formal objection at this stage to the provisions outlined in schedule 2.
Thank you, Sir Roger, for chairing our sittings. It is a pleasure to be part of this Bill Committee. I have a couple of comments on clause 2 and more generally.
The Opposition spokesperson, the hon. Member for Pontypridd, made some points about making sure that we are future-proofing the Bill. There are some key issues where we need to make sure that we are not going backwards. That particularly includes private messaging. We need to make sure that the ability to use AI to find illegal content in private messages, such as content involving child sexual abuse, is retained as it is currently, and that the Bill does not accidentally bar those very important safeguards from continuing. That is one way in which we need to be clear on the best means to go forward with the Bill.
Future-proofing is important—I absolutely agree that we need to ensure that the Bill either takes into account the metaverse and virtual reality or ensures that provisions can be amended in future to take into account the metaverse, virtual reality and any other emerging technologies that we do not know about and cannot even foresee today. I saw a meme online the other day that was somebody taking a selfie of themselves wearing a mask and it said, “Can you imagine if we had shown somebody this in 1995 and asked them what this was? They wouldn’t have had the faintest idea.” The internet changes so quickly that we need to ensure that the Bill is future-proofed, but we also need to make sure that it is today-proofed.
I still have concerns, which I raised on Second Reading, about whether the Bill adequately encompasses the online gaming world, where a huge number of children use the internet—and where they should use it—to interact with their friends in a safe way. A lot of online gaming is free from the bullying that can be seen in places such as WhatsApp, Snapchat and Instagram. We need to ensure that those safeguards are included for online gaming. Private messaging is a thing in a significant number of online games, but many people use oral communication—I am thinking of things such as Fortnite and Roblox, which is apparently a safe space, according to Roblox Corporation, but according to many researchers is a place where an awful lot of grooming takes place.
My other question for the Minister—I am not bothered if I do not get an answer today, as I would rather have a proper answer than the Minister try to come up with an answer right at this moment—is about what category the App Store and the Google Play Store fall into.
I am reluctant to do that. It is a technical fault and it is clearly undesirable, but I do not think we can suspend the Committee for the sake of a technical problem. Every member of the public who wishes to express an interest in these proceedings is able to be present if they choose to do so. Although I understand the hon. Lady’s concern, we have to continue. We will get it fixed as soon as we can.
You are making some really important points about the world of the internet and online gaming for children and young people. That is where we need some serious consideration of obligations on providers regarding media literacy for both children and grown-ups. Many people with children know that this is a really dangerous space for young people, but we are not quite sure we have enough information to understand what the threats, risks and harms are. That point about media literacy, particularly in regard to the gaming world, is really important.
Order. Before we proceed, the same rules apply in Committee as on the Floor of the House to this extent: the Chair is “you”, and you speak through the Chair, so it is “the hon. Lady”. [Interruption.] One moment.
While I am on my feet, I should perhaps have said earlier, and will now say for clarification, that interventions are permitted in exactly the same way as they are on the Floor of the House. In exactly the same way, it is up to the Member who has the Floor to decide whether to give way or not. The difference between these debates and those on the Floor of the House is of course that on the Floor of the House a Member can speak only once, whereas in Committee you have the opportunity to come back and speak again if you choose to do so. Once the Minister is winding up, that is the end of the debate. The Chair would not normally admit, except under exceptional circumstances, any further speech, as opposed to an intervention.
Thank you, Sir Roger.
I do not want to get sidetracked, but I agree that there is a major parental knowledge gap. Tomorrow’s parents will have grown up on the internet, so in 20 years’ time we will not have that knowledge gap, but today media literacy is lacking, particularly among parents as well as among children. In Scotland, media literacy is embedded in the curriculum; I am not entirely sure what the system is in the rest of the UK. My children are learning media literacy in school, but there is still a gap about media literacy for parents. My local authority is doing a media literacy training session for parents tomorrow night, which I am very much looking forward to attending so that I can find out even more about how to keep my children safe online.
I was asking the Minister about the App Store and the Google Play Store. I do not need an answer today, but one at some point would be really helpful. Do the App Store, the Google Play Store and other stores of that nature fall under the definition of search engines or of user-to-user content? The reality is that if somebody creates an app, presumably they are a user. Yes, it has to go through an approval process by Apple or Google, but once it is accepted by them, it is not owned by them; it is still owned by the person who generated it. Therefore, are those stores considered search engines, in that they are simply curating content, albeit moderated content, or are they considered user-to-user services?
That is really important, particularly when we are talking about age verification and children being able to access various apps. The stores are the key gateways where children get apps. Once they have an app, they can use all the online services that are available on it, in line with whatever parental controls parents choose to put in place. I would appreciate an answer from the Minister, but he does not need to provide it today. I am happy to receive it at a later time, if that is helpful.
I want to pick up on two issues, which I hope the Minister can clarify in his comments at the end of this section.
First, when we took evidence, the Internet Watch Foundation underlined the importance of end-to-end encryption being in scope of the Bill, so that it does not lose the ability to pick up child abuse images, as has already been referred to in the debate. The ability to scan end-to-end encrypted content is crucial. Will the Minister clarify whether that is in scope and whether the IWF will be able to continue its important work in safeguarding children?
A number of people have raised concerns about freedom of speech in relation to end-to-end encryption. Does the right hon. Lady agree with me that there should not be freedom of speech when it comes to child sexual abuse images, and that it is reasonable for those systems to check for child sexual abuse images?
The hon. Lady is right to pick up on the nuance and the balance that we have to strike in legislation between freedom of speech and the protection of vulnerable individuals and children. I do not think there can be many people, particularly among those here today, who would want anything to trump the safeguarding of children. Will the Minister clarify exactly how the Bill works in relation to such important work?
Secondly, it is important that the Government have made the changes to schedule 2. They have listened closely on the issue of pornography and extended the provisions of the Bill to cover commercial pornography. However, the hon. Member for Pontypridd mentioned nudification software, and I am unclear whether the Bill would outlaw such software, which is designed to sexually harass women. That software takes photographs only of women, because its database relates only to female figures, and makes them appear to be completely naked. Does that software fall in scope of the Bill? If not, will the Minister do something about that? The software is available and we have to regulate it to ensure that we safeguard women’s rights to live without harassment in their day-to-day life.
I want to just put it on the record that the irony is not lost on me that we are having tech issues relating to the discussion of the Online Safety Bill. The Opposition have huge concerns regarding clause 5. We share the frustrations of stakeholders who have been working on these important issues for many years and who feel the Bill has been drafted in an overly complex way. In its evidence, the Carnegie UK Trust outlined its concerns over the complexity of the Bill, which will likely lead to ineffective regulation for both service users and companies. While the Minister is fortunate to have a team of civil servants behind him, he will know that the Opposition sadly do not share the same level of resources—although I would like to place on the record my sincere thanks to my researcher, Freddie Cook, who is an army of one all by herself. Without her support, I would genuinely not know where I was today.
Complexity is an issue that crops up time and again when speaking with charities, stakeholders and civil society. We all recognise that the Bill will have a huge impact in whatever form it passes, but the complexity of its drafting is a huge barrier to implementation. The same can be said for the regulation. A Bill as complex as this is likely to lead to ineffective regulation for both service users and companies, who, for the first time, will be subject to specific requirements placed on them by the regulator. That being said, we absolutely support steps to ensure that providers of regulated user-to-user services and regulated search services have to abide by a duty of care regime, which will also see the regulator able to issue codes of practice.
I would also like to place on record my gratitude—lots of gratitude today—to Professor Lorna Woods and Will Perrin, who we heard from in evidence sessions last week. Alongside many others, they have been and continue to be an incredible source of knowledge and guidance for my team and for me as we seek to unpick the detail of this overly complex Bill. Colleagues will also be aware that Professor Woods and Mr Perrin originally developed the idea of a duty of care a few years ago now; their model was based on the idea that social media providers should be,
“seen as responsible for public space they have created, much as property owners or operators are in a physical world.”
It will come as no surprise to the Minister that Members of the Opposition stand fully behind that definition and firmly believe that forcing platforms to identify and act on harms that present a reasonable chance of risk is a positive step forward.
More broadly, we welcome moves by the Government to include specific duties on providers of services likely to be accessed by children, although I have some concerns about just how far they will stretch. Similarly, although I am sure we will come to address those matters in the debates that follow, we welcome steps to require Ofcom to issue codes of practice, but have fundamental concerns about how effective they will be if Ofcom is not allowed to remain fully independent and free from Government influence.
Lastly, on schedule 7, I imagine our debate on that schedule will be a key focus for Members. I know attempts to define key terms such as “priority content” will be a challenge for the Minister and his officials, but we remain concerned that there are important omissions, which we will come to later. It is vital that those key terms are broad enough to encapsulate all the harms that we face online. Ultimately, what is illegal offline must be approached in the same way online if the Bill is to have any meaningful positive impact, which is ultimately what we all want.
I want to make a couple of brief comments. Unfortunately, my hon. Friend the Member for Ochil and South Perthshire is not here as, ironically, he is at the DCMS Committee taking evidence on the Online Safety Bill. That is a pretty unfortunate clash of timing, but that is why I am here solo for the morning.
I wanted to make a quick comment on schedule 7. The Minister will have heard the evidence given on schedule 7 and the fact that the other schedules, particularly schedule 6, have a Scotland-specific section detailing the Scottish legislation that applies. Schedule 7 has no Scotland-specific section and does not adequately cover the Scottish legislation. I appreciate that the Minister has tabled amendment 126, which talks about the Scottish and Northern Irish legislation that may be different from England and Wales legislation, but will he give me some comfort that he does intend Scottish-specific offences to be added to schedule 7 through secondary legislation? There is a difference between an amendment on how to add them and a commitment that they will be added if necessary and if he feels that that will add something to the Bill. If he could commit that that will happen, I would appreciate that—obviously, in discussion with Scottish Ministers if amendment 126 is agreed. It would give me a measure of comfort and would assist, given the oral evidence we heard, in overcoming some of the concerns raised about schedule 7 and the lack of inclusion of Scottish offences.
In many ways, clause 6 is the central meat of the Bill. It brings into play a duty of care, which means that people operating online will be subject to the same rules as the rest of us when it comes to the provision of services. But when it comes to the detail, the guidance and codes that will be issued by Ofcom will play a central role. My question for the Minister is: in the light of the evidence that we received, I think in panel three, where the providers were unable to define what was harmful because they had not yet seen codes of practice from Ofcom, could he update us on when those codes and guidance might be available? I understand thoroughly why they may not be available at this point, and they certainly should not form part of the Bill because they need to be flexible enough to be changed in future, but it is important that we know how the guidance and codes work and that they work properly.
Will the Minister update the Committee on what further consideration he and other Ministers have given to the establishment of a standing committee to scrutinise the implementation of the Bill? Unless we have that in place, it will be difficult to know whether his legislation will work.
The hon. Gentleman brings up an important point. We did hear about that in the evidence. I have no doubt the Secretary of State will not want to interfere in the workings of Ofcom. Having been in his position, I know there would be no desire for the Department to get involved in that, but I can understand why the Government might want the power to ensure things are working as they should. Perhaps the answer to the hon. Gentleman’s question is to have a standing committee scrutinising the effectiveness of the legislation and the way in which it is put into practice. That committee could be a further safeguard against what he implies: an unnecessary overreach of the Secretary of State’s powers.
Thank you, Sir Roger, for allowing me to intervene again. I was not expecting the standing committee issue to be brought up at this point, but I agree that there needs to be a post-implementation review of the Bill. I asked a series of written questions to Departments about post-legislative review and whether legislation that the Government have passed has had the intended effect. Most of the Departments that answered could not provide information on the number of post-legislative reviews. Of those that could provide me with the information, none of them had managed to do 100% of the post-implementation reviews that they were supposed to do.
It is important that we know how the Bill’s impact will be scrutinised. I do not think it is sufficient for the Government to say, “We will scrutinise it through the normal processes that we normally use,” because it is clear that those normal processes do not work. The Government cannot say that legislation they have passed has achieved the intended effect. Some of it will have and some of it will not have, but we do not know because we do not have enough information. We need a standing committee or another way to scrutinise the implementation.
I thank the hon. Lady for raising this point. Having also chaired a Select Committee, I can understand the sensitivities around this falling under the current DCMS Committee, but the reality is that the Bill’s complexity and other pressures on the DCMS Committee mean that this perhaps should be seen as an exceptional circumstance—in no way is that meant as disrespect to that Select Committee, which is extremely effective in what it does.
I completely agree. Having sat on several Select Committees, I am aware of the tight timescales. There are not enough hours in the day for Select Committees to do everything that they would like to do. It would be unfortunate and undesirable were this matter to be one that fell between the cracks. Perhaps DCMS will bring forward more legislation in future that could fall between the cracks. If the Minister is willing to commit to a standing committee or anything in excess of the normal governmental procedures for review, that would be a step forward from the position that we are currently in. I look forward to hearing the Minister’s views on that.
I do not intend to speak to this specific point, but I wholeheartedly agree and will be happy to back amendment 69, should the hon. Lady press it to a vote.
In a moment.
For those reasons, I think we have drawn the line in the right place. There is personal criminal liability for information provision, with fines of 10% of global revenue and service disruption—unplugging powers—as well. Having thought about it quite carefully, I think we have struck the balance in the right place. We do not want to deter people from offering services in the UK. If they worried that they might go to prison too readily, it might deter people from locating here. I fully recognise that there is a balance to strike. I feel that the balance is being struck in the right place.
I will go on to comment on a couple of examples we heard about Carillion and the financial crisis, but before I do so, I will give way as promised.
I appreciate that the Minister says he has been swithering on this point—he has been trying to work out the correct place to draw the line. Given that we do not yet have a commitment for a standing committee—again, that is potentially being considered—we do not know how the legislation is going to work. Will the Minister, rather than accepting the amendment, give consideration to including the ability to make changes via secondary legislation so that there is individual criminal liability for different breaches? That would allow him the flexibility in the future, if the regime is not working appropriately, to add through secondary legislation individual criminal liability for breaches beyond those that are currently covered.
I have not heard that idea suggested. I will think about it. I do not want to respond off the cuff, but I will give consideration to the proposal. Henry VIII powers, which are essentially what the hon. Lady is describing—an ability through secondary legislation effectively to change primary legislation—are obviously viewed askance by some colleagues if too wide in scope. We do use them, of course, but normally in relatively limited circumstances. Creating a brand new criminal offence via what amounts to a Henry VIII power would be quite a wide application of the power, but it is an idea that I am perfectly happy to go away and reflect on. I thank her for mentioning the idea.
A couple of examples were given about companies that have failed in the past. Carillion was not a financial services company and there was no regulatory oversight of the company at all. In relation to financial services regulation, despite the much stricter regulation that existed in the run-up to the 2008 financial crisis, that crisis occurred none the less. [Interruption.] We were not in government at the time. We should be clear-eyed about the limits of what regulation alone can deliver, but that does not deter us from taking the steps we are taking here, which I think are extremely potent, for all the reasons that I mentioned and will not repeat.
Question put, That the amendment be made.
I want to talk about a few different things relating to the amendments. Speaking from the Opposition Front Bench, the hon. Member for Pontypridd covered in depth amendment 20, which relates to being directed to other content. Although this seems like a small amendment, it would apply in a significant number of different situations. Particular mention was made of Discord for gaming, but also of things such as moving from Facebook to Messenger—all those different directions that can happen. A huge number of those are important for those who would seek to abuse children online by trying to move from the higher-regulation services or ones with more foot traffic to areas with perhaps less moderation so as to attack children in more extreme ways.
I grew up on the internet and spent a huge amount of time speaking to people, so I am well aware that people can be anyone they want to be on the internet, and people do pretend to be lots of different people. If someone tells us their age on the internet, we cannot assume that that is in any way accurate. I am doing what I can to imprint that knowledge on my children in relation to any actions they are taking online. In terms of media literacy, which we will come on to discuss in more depth later, I hope that one of the key things that is being told to both children and adults is that it does not matter if people have pictures on their profile—they can be anybody that they want to online and could have taken those pictures from wherever.
In relation to amendment 21 on collaboration, the only reasonable concern that I have heard is about an action taken by Facebook in the US, where it employed an outside company to place stories in local newspapers about vile things that were happening on TikTok. Those stories were invented—they were made up—specifically to harm TikTok’s reputation. I am not saying for a second that collaboration is bad, but I think the argument that some companies may make, that it is bad because it causes them problems and their opponents may use it against them, proves the need to have a regulator. The point of having a regulator is to ensure that any information sharing or collaboration that is required is done in a way that, should a company decide to use it with malicious intent, the regulator can come down on them. The regulator ensures that the collaboration that we need to happen in order for emergent issues to be dealt with as quickly as possible is done in a way that does not harm people. If it does harm people, the regulator is there to take action.
I want to talk about amendments 25 and 30 on the production of images and child sexual abuse content. Amendment 30 should potentially have an “or” at the end rather than an “and”. However, I am very keen to support both of those amendments, and all the amendments relating to the production of child sexual abuse content. On the issues raised by the Opposition about livestreaming, for example, we heard two weeks ago about the percentage of self-generated child sexual abuse content. The fact is that 75% of that content is self-generated. That is absolutely huge.
If the Bill does not adequately cover production of the content, whether it is by children and young people who have been coerced into producing the content and using their cameras in that way, or whether it is in some other way, then the Bill fails to adequately protect our children. Purely on the basis of that 75% stat, which is so incredibly stark, it is completely reasonable that production is included. I would be happy to support the amendments in that regard; I think they are eminently sensible. Potentially, when the Bill was first written, production was not nearly so much of an issue. However, as it has moved on, it has become a huge issue and something that needs tackling. Like Opposition Members, I do not feel like the Bill covers production in as much detail as it should, in order to provide protection for children.
Amendment 10 would create a duty to publish the illegal content risk assessment, and proactively supply that to Ofcom. This is new legislation that is really a trial that will set international precedent, and a lot of the more prescriptive elements—which are necessary—are perhaps the most challenging parts of the Bill. The Minister has been very thoughtful on some of the issues, so I want to ask him, when we look at the landscape of how we look to regulate companies, where does he stand on transparency and accountability? How far is he willing to go, and how far does the Bill go, on issues of transparency? It is my feeling that the more companies are forced to publish and open up, the better. As we saw with the case of the Facebook whistleblower Frances Haugen, there is a lot to uncover. I therefore take this opportunity to ask the Minister how far the Bill goes on transparency and what his thoughts are on that.
All the companies have to do the risk assessment, for example for the “illegal” duties, where they are required to by the Bill. For the “illegal” duties, that is all of them; they have to do those risk assessments. The question is whether they have to send them to Ofcom—all of them—even if they are very low risk or have very low user numbers, and whether Ofcom, by implication, then has to consider them, because it would be pointless to require them to be sent if they were not then looked at. We want to ensure that Ofcom’s resources are pointed at the areas where the risks arise. Ofcom can request any of these. If Ofcom is concerned—even a bit concerned—it can request them.
Hon. Members are then making a slightly adjacent point about transparency—about whether the risk assessments should be made, essentially, publicly available. There are legitimate questions about comprehensive public disclosure and about getting to the heart of what is going on in these companies, in the way that Frances Haugen’s whistleblower disclosures did. But we also need to be mindful of what we might call malign actors—people who are trying to circumvent the provisions of the Bill—in relation to some of the “illegal” provisions, for example. We do not want to give them so much information that they know how they can circumvent the rules. Again, there is a balance to strike between ensuring that the rules are properly enforced and having such a high level of disclosure that people seeking to circumvent the rules are able to work out how to do so.
If the rules are so bad that people can circumvent them, they are not good enough anyway and they need to be updated, but I have a specific question on this. The Minister says that Ofcom will be taking in the biggest risk assessments, looking at them and ensuring that they are adequate. Will he please give consideration to asking Ofcom to publish the risk assessments from the very biggest platforms? Then they will all be in one place. They will be easy for people to find and people will not have to rake about in the bottom sections of a website. And it will apply only in the case of the very biggest, most at risk platforms, which should be regularly updating their risk assessments and changing their processes on a very regular basis in order to ensure that people are kept safe.