Consideration of Lords amendments
Clause 82
General duties of OFCOM under section 3 of the Communications Act
14:11
Mr Deputy Speaker (Sir Roger Gale)

With this it will be convenient to discuss the following:

Lords amendment 349, and Government amendments (a) and (b).

Lords amendment 391, Government amendment (a), and Government consequential amendment (a).

Lords amendment 17, Government motion to disagree, and Government amendments (a) and (b) in lieu.

Amendment (i) to Government amendment (a) in lieu of Lords amendment 17.

Lords amendment 20, and Government motion to disagree.

Lords amendment 22, and Government motion to disagree.

Lords amendment 81, Government motion to disagree, and Government amendments (a) to (c) in lieu.

Lords amendment 148, Government motion to disagree, and Government amendment (a) in lieu.

Lords amendment 1, and amendments (a) and (b).

Lords amendments 2 to 16, 18, 19, 21, 23 to 80, 82 to 147, 149 to 181 and 183 to 188.

Lords amendment 189, and amendment (a) in lieu.

Lords amendments 190 to 216.

Lords amendment 217, and amendment (a).

Lords amendments 218 to 227.

Lords amendment 228, and amendment (a).

Lords amendments 229 and 230.

Lords amendment 231, and amendment (a).

Lords amendments 232 to 319.

Lords amendment 320, and amendment (a).

Lords amendment 321, and amendment (a).

Lords amendments 322 to 348, 350 to 390 and 392 to 424.

Paul Scully

As we know from proceedings in this place, the Online Safety Bill is incredibly important. I am delighted that it is returning to the Commons in great shape, having gone through extensive and thorough scrutiny in the Lords. The Bill is world-leading, and the legislative framework it establishes will create a profoundly safer online environment in this country. It will kickstart change where that is sorely needed, and ensure that our children are better protected against pornography and other content that is harmful to them. The Bill will also guard children against perpetrators of abhorrent child sexual exploitation and abuse, and ensure that tech companies either take responsibility for tackling such content on their platforms or are held criminally accountable.

Sir William Cash (Stone) (Con)

As I am sure my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) will agree, may I say how much we appreciate what the Government have done in relation to the matter just referred to? As the Minister knows, we withdrew our amendment in the House of Commons after discussion, and we had amazingly constructive discussions with the Government right the way through, and also in the House of Lords. I shall refer to that if I am called to speak later, but I simply wanted to put on record our thanks, because this will save so many children’s lives.

Paul Scully

I thank my hon. Friend and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) for all their work on this. I hope that this debate will show that we have listened and tried to work with everybody, including on this important part of the Bill. We have not been able to capture absolutely everything that everybody wants, but we are all determined to ensure that the Bill gets on the statute book as quickly as possible, to ensure that we start the important work of implementing it.

We have amended the Bill to bolster its provisions. A number of topics have been of particular interest in the other place. Following engagement with colleagues on those issues, we have bolstered the Bill’s protections for children, including a significant package of changes relating to age assurance. We have also enhanced protections for adult users.

Sajid Javid (Bromsgrove) (Con)

My hon. Friend will know that Ministers and officials in his Department have worked extensively—I thank them for that—with me, Baroness Kidron, and the Bereaved Families for Online Safety group, on the amendment that will make it easier for coroners to have access to data from online companies in the tragic cases where that might be a cause of a child’s death. He will also know that there will still be gaps in legislation, but such gaps could be closed by further measures in the Data Protection and Digital Information Bill. His ministerial colleague in the other place has committed the Government to that, so may I invite my hon. Friend to set out more about the Government’s plans for doing just that?

Paul Scully

I thank my right hon. Friend for his work on this, and Baroness Kidron for her work. I will cover that in more detail in a moment, but we remain committed to exploring measures that would facilitate better access to data for coroners under specific circumstances. We are looking for the best vehicle to do that, which includes those possibilities in the Data Protection and Digital Information Bill. We want to ensure that the protections for adult users afford people greater control over their online experience.

14:15
The Bill will ensure that Ofcom has the powers it needs to ensure that coroners are provided with the information they need following such a tragedy. That provision was championed not only by my right hon. Friend and Baroness Kidron, but by Ian Russell and other bereaved parents with whom we have worked closely to ensure that we get the right solution. I am grateful for their tireless efforts. We have made sure that we can address the concerns raised by Members about the risks relating to the design and functionality of services, because this is a complicated issue, for a number of reasons that have been well rehearsed. The changes I have outlined will ensure that the Bill contains the strongest possible protections for children, that users’ rights to freedom of expression and privacy are protected, and that services are transparent and accountable.
Let me go into more detail on the Government amendments that were passed during the Bill’s passage through the Lords, and the amendments that I present to the House today. As I have said, child safety is a key priority in the Bill, and during its passage through the Lords we have further strengthened its protections for children. That has included placing the categories of “primary priority” and “priority” content that is harmful to children in the Bill. That will provide companies and Ofcom with explicit and early confirmation on the kind of content that children must be protected from, rather than addressing those issues later via secondary legislation. Providers of the largest services will also be required to publish summaries of their risk assessments for illegal content and content that is harmful to children. That will empower children and their parents or carers to clearly understand the risks to children presented by such services.
The Government listened to the views expressed in both Houses and introduced new offences in Committee that will more effectively hold technology companies to account if they fail to protect children. Ofcom will now be able to hold companies and senior managers, where they are at fault, criminally liable if the provider fails to comply with Ofcom’s enforcement notices in relation to specific child safety duties or to child sexual abuse and exploitation on their service.
Sir John Hayes (South Holland and The Deepings) (Con)

The Minister is setting out a powerful case for how the Government have listened to the overtures in this place and the other place. Further to the interventions from my hon. Friend the Member for Stone (Sir William Cash) and my right hon. Friend the Member for Bromsgrove (Sajid Javid), the former Culture Secretary, will the Minister be clear that the risk here is under-regulation, not over-regulation? Although the internet may be widely used by perfectly good people, the people who run internet companies are anything but daft and more likely to be dastardly.

Paul Scully

This is a difficult path to tread in approaching this issue for the first time. In many ways, these are things that we should have done 10 or 15 years ago, as social media platforms and people’s engagement with them proliferated over that period. Regulation has to be done gently, but it must be done. We must act now and get it right, to ensure that we hold the big technology companies in particular to account, while also understanding the massive benefits that those technology companies and their products provide.

Debbie Abrahams (Oldham East and Saddleworth) (Lab)

I agree with the Minister that this is a groundbreaking Bill, but we must be clear that there are still gaps. Given what he is saying about the requirements for regulation of online social media companies and other platforms, how will he monitor, over a period of time, whether the measures that we have are as dynamic as they need to be to catch up with social media as it develops?

Paul Scully

The hon. Lady asks an important question, and that is the essence of what we are doing. We have tried to make this Bill flexible and proportionate. It is not technology specific, so that it is as future-proofed as possible. We must obviously lean into Ofcom as it seeks to operationalise the Act once the Bill gains Royal Assent. Ofcom will come back with its reporting, so not only will Government and the Department be a check on this, but Parliament will be able to assess the efficacy of the Bill as the system beds in and as technology and the various platforms move on and develop.

I talked about the offences, and I will just finalise my point about criminal liability. Those offences will be punishable with up to two years in prison.

John Penrose (Weston-super-Mare) (Con)

Further to that point about the remaining gaps in the Bill, I appreciate what the Minister says about this area being a moving target. Everybody—not just in this country, but around the world—is having to learn as the internet evolves.

I thank the Minister for Government amendment 241, which deals with provenance and understanding where information posted on the web comes from, and allows people therefore to check whether they want to see it, if it comes from dubious sources. That is an example of a collective harm—of people posting disinformation and misinformation online and attempting to subvert our democratic processes, among other things. I park with him, if I may, the notion that we will have to come back to that area in particular. It is an area where the Bill is particularly weak, notwithstanding all the good stuff it does elsewhere, notably on the areas he has mentioned. I hope that everyone in this House accepts that that area will need to be revisited in due course.

Paul Scully

Undoubtedly we will have to come back to that point. Not everything needs to be in the Bill at this point. We have industry initiatives, such as Adobe’s content security policy, which are good initiatives in themselves, but as we better understand misinformation, disinformation, deepfakes and the proliferation and repetition of fake images, fake text and fake news, we will need to keep ensuring we can stay ahead of the game, as my hon. Friend said. That is why we have made the legislation flexible.

Dame Margaret Hodge (Barking) (Lab)

I have two things to ask. First, will the Minister spell out more clearly how Parliament will be able to monitor the implementation? What mechanisms do we have to do that? Secondly, on director liability, which I warmly welcome—I am pleased that the Government have listened to Back Benchers on this issue—does he not agree that the example we have set in the Bill should be copied in other Bills, such as the Economic Crime and Corporate Transparency Bill, where a similar proposal exists from Back Benchers across the House?

Paul Scully

The right hon. Lady raises some interesting points. We have conversed about harms, so I totally get her point about making sure that we tackle this issue in Parliament and are accountable to Parliament. As I have said, that will be done predominantly by monitoring the Bill through Ofcom’s reporting on what harms it is having to deal with. We have regular engagement with Ofcom, not only here and through the Select Committees, but through the Secretary of State.

On criminal liability, we conversed about that and made sure that we had a liability attached to something specific, rather than the general approach proposed at the beginning. It therefore means that we are not chilling innovation. People can understand, as they set up their approaches and systems, exactly what they are getting into in terms of risk for criminal liability, rather than having the general approach that was suggested at the beginning.

Kirsty Blackman (Aberdeen North) (SNP)

The review mechanism strikes me as one of the places where the Bill falls down and is weakest, because there is not a dedicated review mechanism. We have needed this legislation for more than 30 years, and we have now got to the point of legislating. Does the Minister understand why I have no faith that future legislation will happen in a timely fashion, when it has taken us so long even to get to this point? Can he give us some reassurance that a proper review will take place, rather than just having Ofcom reports that may or may not be read?

Paul Scully

I have talked about the fact that we have to keep this legislation under review, because the landscape is fast-moving. At every stage that I have been dealing with this Bill, I have said that inevitably we will have to come back. We can make the Bill as flexible, proportionate and tech-unspecific as we can, but things are moving quickly. With all our work on AI, for example, such as the AI summit, the work of the Global Partnership on Artificial Intelligence, the international response, the Hiroshima accord and all the other areas that my hon. Friend the Member for Weston-super-Mare (John Penrose) spoke about earlier, we will have to come back, review it and look at whether the legislation remains world-beating. It is not just about the findings of Ofcom as it reports back to us.

I need to make a bit of progress, because I hope to have time to sum up a little bit at the end. We have listened to concerns about ensuring that the Bill provides the most robust protections for children from pornography and on the use of age assurance mechanisms. We are now explicitly requiring relevant providers to use highly effective age verification or age estimation to protect children from pornography and other primary priority content that is harmful to children. The Bill will also ensure a clear privacy-preserving and future-proofed framework governing the use of age assurance, which will be overseen by Ofcom.

There has been coverage in the media about how the Bill relates to encryption, which has often not been accurate. I take the opportunity to set the record straight. Our stance on challenging sexual abuse online remains the same. Last week in the other place, my noble Friend Lord Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, shared recent data from UK police forces that showed that 6,350 offences related to sexual communication with a child were recorded last year alone. Shockingly, 5,500 of those offences took place against primary school-age children. Those appalling statistics illustrate the urgent need for change. The Government are committed to taking action against the perpetrators and stamping out these horrific crimes. The information that social media companies currently give to UK law enforcement contributes to more than 800 arrests or voluntary attendances of suspected child sexual offenders on average every month. That results in an estimated 1,200 children being safeguarded from child sexual abuse.

There is no intention by the Government to weaken the encryption technology used by platforms. As a last resort, on a case-by-case basis, and only when stringent privacy safeguards have been met, Ofcom will have the power to direct companies to make best efforts to develop or source technology to identify and remove illegal child sexual abuse content. We know that this technology can be developed. Before it can be required by Ofcom, such technology must meet minimum standards of accuracy. If appropriate technology does not exist that meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a new solution.

Damian Collins (Folkestone and Hythe) (Con)

Does my hon. Friend agree that the companies already say in their terms of service that they do not allow illegal use of their products, yet they do not say how they will monitor whether there is illegal use and what enforcement they take? What the Bill gives us, for the first time, is the right for Ofcom to know the answers to those questions and to know whether the companies are even enforcing their own terms of service.

Paul Scully

My hon. Friend makes an important point, and I thank him for the amazing work he has done in getting the Bill to this point and for his ongoing help and support in making sure that we get it absolutely right. This is not about bashing technology companies; it is about not only holding them to account, but bringing them closer, to make sure that we can work together on these issues to protect the children I was talking about.

Despite the breadth of existing safeguards, we recognise the concerns expressed about privacy and technical feasibility in relation to Ofcom’s power to issue CSE or terrorism notices. That is why we introduced additional safeguards in the Lords. First, Ofcom will be required to obtain a skilled person’s report before issuing any warning notice and exercising its powers under clause 122. Ofcom must also provide a summary of the report to the relevant provider when issuing a warning notice. We are confident that in addition to Ofcom’s existing routes of evidence gathering, this measure will help to provide the regulator with the necessary information to determine whether to issue a notice and the requirements that may be put in place.

We also brought forth amendments requiring Ofcom to consider the impact that the use of technology would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. That builds on the existing safeguards in clause 133 regarding freedom of expression and privacy.

We recognise the disproportionate levels of harm that women and girls continue to face online, and that is why the Government have made a number of changes to the Bill to strengthen protections for women and girls. First, the Bill will require Ofcom to produce guidance on online harms that disproportionately affect women and girls and to provide examples of best practice to providers, and it will require providers to bring together in one clear place all the measures that they take to tackle online abuse against women and girls on their platforms. The Bill will also require Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner, in addition to the Children’s Commissioner, while preparing codes of practice. That change to the Bill will ensure that the voices of victims of abuse are brought into the consultation period.

14:30
The offence of controlling or coercive behaviour has been added as a priority offence and will require companies to proactively tackle such content and activity that disproportionately affects women and girls. The Bill also introduces new offences relating to intimate image abuse, including criminalising deepfakes for the first time in England and Wales. Those new offences to protect women and girls sit alongside other changes that we have made to the criminal law to ensure that it is fit for purpose in the modern age. For example, we have also introduced a new communications offence of intentionally encouraging or assisting serious self-harm. Our amendments will also require platforms to remove the most harmful self-harm content for all users. The offence has been designed to avoid criminalising or removing recovery and support content.
The Government are committed to empowering adults online and made changes to the Bill to strengthen the user empowerment content duties. First, we have introduced a new content assessment duty in relation to the main user empowerment duties. That will require big tech platforms to carry out comprehensive assessments of the prevalence of content on their services that falls within the scope of the user empowerment duties, such as legal content that encourages suicide or an act of self-harm. They will need to keep a record of that assessment and publish a summary of it for their users in their terms of service. The new duty will underpin the main duties to offer user empowerment tools, ensuring that platforms and users have a comprehensive understanding of the relevant types of content on their services.
Secondly, where category 1 providers offer the user empowerment tools, we have further strengthened the duties on them by requiring them to proactively ask their registered adult users whether they wish to use the user empowerment content features. That will help to make the tools easier for users to opt into or out of. This approach continues to balance the empowerment of users and the protection of freedom of expression by avoiding the “default on” approach.
Baroness Fraser of Craigmaddie made amendments in the other place that aligned the definition of the term “freedom of expression” in the Bill with that in the European convention on human rights. That also reflects the approach of other UK legislation, including the Higher Education (Freedom of Speech) Act 2023. Those amendments will increase clarity about freedom of expression in the Bill.
The Government recognise the difficulties that coroners and bereaved families have when seeking to understand the circumstances surrounding a child’s death and have introduced a number of amendments to address those issues; I have already outlined some of them. First, we expanded Ofcom’s information gathering powers so that the regulator can require information from regulated services about a deceased child’s online activity following a request from a coroner. That is backed up by Ofcom’s existing enforcement powers. We have also given Ofcom the power to produce a bespoke report for the coroner and enabled the regulator to share information with a coroner without a business’s prior consent to disclosure. That will ensure that Ofcom can collect such information and share it with the coroner where appropriate, so that coroners have access to the expertise and information they need to conduct their investigations.
Finally, we have introduced amendments to ensure that the process for accessing data regarding the online activities of a deceased child is more straightforward and humane. The largest companies must set out policies on disclosure of such data in a clear, accessible and sufficiently detailed format in their terms of service. They must also respond in writing in a timely manner, provide a dedicated means for parents to communicate with the company and put in place a mechanism for parents to complain if they consider that a service is not meeting its obligations.
We recognise the valuable work of researchers in improving our collective understanding of online safety issues, which is why we have made amendments to the Bill that require Ofcom to publish its report into researcher access to information within 18 months rather than two years. Ofcom will then be required to publish guidance on the issue, setting out best practice for platforms to share information in a way that supports their research functions while protecting user privacy and commercially sensitive material. While we will not be making additional changes to the Bill during the remainder of its passage, we understand the call for further actions in this area. That is why we have made a commitment to explore this issue further and report back to the House in due course on whether further measures to support researcher access to data are required and, if so, whether they could also be implemented through other legislation such as the Data Protection and Digital Information Bill.
The Government heard the House’s concerns about the risk of algorithms and their impact on our interactions online. Given the influence they can have, the regulator must be able to scrutinise the algorithms’ functionalities and other systems and processes that providers use. We have therefore made changes to provide Ofcom with the power to authorise a person to view specific types of information remotely: information demonstrating the operation of a provider’s systems, processes or features, including algorithms, and tests or demonstrations. There are substantial safeguards around the use of that power, which include: Ofcom’s legal duty to exercise it proportionately; a seven-day notification period; and the legal requirement to comply with data protection rules and regulations.
The Government are grateful to Baroness Morgan of Cotes and my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), who like many in the House have steadfastly campaigned on the issue of small but risky platforms. We have accepted an amendment to the Bill that changes the rules for establishing the conditions that determine which services will be designated as category 1 or category 2B services and thus have additional duties. In making the regulations used to determine which services are category 1 or category 2B, the Secretary of State will now have the discretion to decide whether to set a threshold based on the number of users or the functionalities offered, or both factors. Previously, the Secretary of State was required to set the threshold based on a combination of both factors.
It is still the expectation that only the most high risk user-to-user services will be designated as category 1 services. However, the change will ensure that the framework is as flexible as possible in responding to the risk landscape. I say to my hon. Friend the Member for Yeovil (Mr Fysh), who I know will speak later, that it is not meant to capture user-to-user systems; it is very much about content but not about stifling innovation in areas such as distributed ledgers and so on.
Dame Margaret Hodge

I am grateful for the amendment, which I think is important. Will the Minister make it clear that he will not accept the amendments tabled by the hon. Member for Yeovil (Mr Fysh)?

Paul Scully

Indeed, we will not be accepting those amendments, but I will cover more of that later on, after I have listened to the comments that I know my hon. Friend wants to make.

As a result of the amendment, we have also made a small change to clause 98—the emerging category 1 services list—to ensure that it makes operational sense. Prior to Baroness Morgan’s amendment, a service had to meet the functionality threshold for content and 75% of the user number threshold to be on the emerging services list. Under the amended clause, there is now a plausible scenario in which a service could meet the category 1 threshold without meeting any condition based on user numbers, so we had to make the change to ensure that the clause works in that scenario.

We have always been clear that the design of a service, its functionalities and its other features are key drivers of the risk of harm to children. Baroness Kidron’s amendments 17, 20, 22 and 81 seek to treat those aspects as sources of harm in and of themselves. Although we agree with the objective, we are concerned that they do not work within the legislative framework and risk legal confusion and delaying the Bill. We have worked closely with Baroness Kidron and other parliamentarians to identify alternative ways to make the role that design and functionalities play more explicit. I am grateful to colleagues in both Houses for being so generous with their time on this issue. In particular, I thank again my right hon. and learned Friend the Member for Kenilworth and Southam for his tireless work, which was crucial in enabling the creation of an alternative and mutually satisfactory package of amendments. We will disagree to Lords amendments 17, 20, 22 and 81 and replace them with amendments that make it explicit that providers are required to assess the impact that service design, functionalities and other features have on the risk of harm to children.

On Report, my hon. Friend the Member for Crawley (Henry Smith) raised animal abuse on the internet and asked how we might address such harmful content. I am pleased that the changes we have since made to the Bill fully demonstrate the Government’s commitment to tackling criminal activity relating to animal torture online. It is a cause that Baroness Merron has championed passionately. Her amendment in the other place sought to require the Secretary of State to review certain offences and, depending on the review’s outcome, to list them as priority offences in schedule 7. To accelerate measures to tackle such content, the Government will remove clause 63—the review clause—and instead immediately list section 4(1) of the Animal Welfare Act 2006 as a priority offence. Officials at the Department for Environment, Food and Rural Affairs have worked closely with the Royal Society for the Prevention of Cruelty to Animals and are confident that the offence of unnecessary suffering will capture a broad swathe of behaviour. I hope the whole House will recognise our efforts and those of Baroness Merron and support the amendment.

You will be pleased to know, Mr Deputy Speaker, that I will conclude my remarks. I express my gratitude to my esteemed colleagues both here and in the other place for their continued and dedicated engagement with this complicated, complex Bill during the course of its parliamentary passage. I strongly believe that the Bill, in this form, strikes the right balance in providing the strongest possible protections for both adults and children online while protecting freedom of expression. The Government have listened carefully to the views of Members on both sides of the House, stakeholders and members of the public. The amendments we have made during the Bill’s progress through the Lords have further enhanced its robust and world-leading legislative framework. It is groundbreaking and will ensure the safety of generations to come. I ask Members of the House gathered here to support the Government’s position on the issues that I have spoken about today.

Mr Deputy Speaker (Sir Roger Gale)

I call the Opposition spokesperson.

Alex Davies-Jones (Pontypridd) (Lab)

Before I address the amendments at hand, let me first put on record my thanks for the incredible efforts of our colleagues in the other place. The Bill has gone on a huge journey. The Government have repeatedly delayed its passage, and even went to great effort to recommit parts of the Bill to Committee in an attempt to remove important provisions on legal but harmful content. For those reasons alone, it is somewhat of a miracle that we have arrived at this moment, with a Bill that I am glad to say is in a much better place than when we last debated it here. That is thanks to the tireless work of so many individuals, charities and organisations, which have come together to coalesce around important provisions that will have a positive impact on people’s lives online.

Today, we have the real privilege of being joined by Ian Russell, Stuart Stephens, Emilia Stevens, Hollie Dance and Lisa Kenevan, who have all been impacted by losing a child at the hands of online harm. I want to take a moment to give my most heartfelt thanks to them all, and to the other families who have shared their stories, insights and experiences with colleagues and me as the Bill progressed. Today, in our thoughts are Archie, Isaac, Olly, Molly and all the other children who were taken due to online harm. Today, their legacy stands before us. We would not be here without you, so thank you.

We also could not have arrived at this point without the determination of colleagues in the other place, notably Baroness Kidron. Colleagues will know that she has been an extremely passionate, determined and effective voice for children throughout, and the Bill is stronger today thanks to her efforts. More broadly, I hope that today’s debate will be a significant and poignant moment for everyone who has been fighting hard for more protections online for many years.

It is good to see the Minister in his place. This is a complex Bill, and it has been the responsibility of many of his colleagues since its introduction to Parliament. That being said, it will come as no surprise that Labour is pleased with some of the significant concessions that the Government have made on the Bill. Many stem from amendments the Opposition sought to make early on in the Bill’s passage. Although his Department’s press release may try to claim a united front, let us be clear: the Bill has sadly fallen victim to Tory infighting from day one. The Conservatives truly cannot decide if they are the party of protecting children or of free speech, when they should be the party of both. Sadly, some colleagues on the Government Benches have tried to stop the Bill in its tracks entirely, but Labour has always supported the need for it. We have worked collaboratively with the Government and have long called for these important changes. It is a welcome relief that the Government have finally listened.

Let me also be clear that the Bill goes some way to regulating the online space of the past and the present, but it makes no effort to future-proof against or anticipate emerging harms. The Labour party has repeatedly warned the Government of our concerns that, thanks to the Bill’s focus on content rather than social media platforms’ business models, it may not go far enough. With that in mind, I echo calls from across the House. Will the Minister commit to a review of the legislation within five years of enactment, to ensure that it has met their objective of making the UK the safest place in the world to be online?

Richard Burgon (Leeds East) (Lab)

My hon. Friend is making an important speech. It is clear that the Government want to tackle harmful suicide and self-harm content. It is also clear that the Bill does not go far enough. Does she agree that we should support Samaritans’ suggested way forward after implementation? We need the Government to engage with people with lived experience of suicide and self-harm, to ensure that the new legislation makes things better. If it is shown—as we fear—not to go far enough, new legislative approaches will be required to supplement and take it further, to ensure that the internet is as safe as possible for vulnerable people of all ages.

Alex Davies-Jones

I thank my hon. Friend for that intervention. He has been a passionate advocate on that point, speaking on behalf of his constituent Joe Nihill and his family for more protections in the Bill. It is clear that we need to know whether the legislation works in practice. Parliamentary oversight of that is essential, so I echo calls around the Chamber for that review. How will it take place? What will it look like? Parliament must have oversight, so that we know whether the legislation is fit for purpose.

14:45
Let me turn to the amendments. Labour is particularly pleased to see the Government follow the excellent lead of Baroness Kidron in the other place by addressing the alarming gaps in children’s risk assessments. Those amendments will go some way to ensuring that social media platforms have to consider carefully the harmful content that is both created on and disseminated through their services when children use them. This is an incredibly important point. With online content being constantly available and drip-fed to children thanks to autoplay features, it is right that risk assessments relating to harm will have to include wide-ranging provisions.
I wonder, however, if the Minister could explain one particular point. The Government’s own press releases have long lauded the Bill as focused on child safety. Indeed, in the Secretary of State’s open letter in December to all parents, carers and guardians, she notes:
“The strongest protections in this legislation are for children and young people.”
I would therefore be interested to hear from the Minister exactly why the Bill makes no specific reference to children’s rights, or more specifically the UN convention on the rights of the child. It is the most widely ratified international human rights treaty in history, yet it is missing from the Bill. I hope the Minister can clarify that for us all.
It will come as no surprise that Labour is proud that the Government have conceded on an important amendment that will see social media sites required to proactively remove animal torture content online. I first raised that issue in Committee more than a year ago. Vile animal torture content has no place online or in society. I am proud that it was the Labour party that first recognised the alarming gap in the Government’s earlier draft of the Bill. Research from the RSPCA showed that, in 2021, there were 756 reports of animal abuse on social media, compared with 431 in 2020 and 157 in 2019. We can all see that this horrific content is more widespread and common than we might initially have believed. Thanks to Labour party colleagues in the other place, particularly Baroness Merron, it will no longer be tolerated online.
I am particularly proud to see the Government adopt an amendment that represents a move towards a risk-based approach to service categorisation. This is an important point and a significant concession from the Government. Along with many others, I repeatedly warned the Government that, by focusing on size rather than risk in its approach to categorisation, the online safety regime was doomed to fail. Put simply, we could have been left in a position where some of the most harmful websites and platforms, including 4chan and BitChute, which regularly host and promote far-right, antisemitic content, slipped through the cracks of the legislation. None of us wanted that to happen, but in May 2022 the then Minister chose not to accept a cross-party amendment in Committee that could have resolved the issue more than a year ago.
We are pleased to see progress, and thank colleagues in the other place for seeing sense, but that approach highlights the Government’s wider strategy for online safety: one based on delay and indecision. If more proof were needed, we need only turn to the Government’s approach to allowing researcher access to data relating to the online safety regime. Labour welcomes the small Government amendments made in the other place on this point, but there are real-world consequences if the Government do not consider improving levels of transparency. Other jurisdictions across the globe are looking at strengthening their data transparency provisions because they recognise that regulators such as Ofcom need academics and civil society to have sight of data in the most complex of cases. In Australia and Canada, there is real progress. Our friends across the pond in the USA have recently signed a deal with the EU that will see them committed to working together on researcher access to data.
The Secretary of State talks a good game about our world-leading universities and research environment, and claims that she wants the UK to be a leader, yet inaction is putting our country and our homegrown talent pool at a disadvantage. Let us be clear: access to data goes further than academics. In the last month, Elon Musk has sought to sue the Centre for Countering Digital Hate and the Anti-Defamation League, organisations filled to the brim with brilliant British research excellence. I recognise that the Government have made a loose commitment to go further in future legislation, but that has not been formally outlined. I sincerely hope that the promises made to bereaved parents about further progress in the Data Protection and Digital Information Bill are kept. I would be grateful for some reassurance from the Minister on that point.
Labour has long campaigned for stronger protections to keep people—both children and adults—safe online. The Bill has made remarkable progress, thanks to our colleagues in the other place coming together and genuinely putting people’s priorities over party politics and political gain. Labour has always supported an online safety regime, and has sought to work with the Government while raising valid concerns carefully throughout. We all want to keep people safe and there is broad consensus that social media companies have failed to regulate themselves. Labour is proud of the changes it has developed and pushed on, but this is not the end. We will continue to push the Government to deliver this regime in good time and to ensure that it is reviewed regularly and has appropriate parliamentary oversight. After all, children and adults across the UK deserve, as a first priority, to be kept safe. The Minister knows we will be closely watching.
Several hon. Members rose—

Mr Deputy Speaker (Sir Roger Gale)

I call the Chair of the Select Committee.

Dame Caroline Dinenage (Gosport) (Con)

I welcome the return of the Online Safety Bill from its exhaustive consideration in the other place. As the Minister knows, this vital legislation kicked off several years ago under the leadership of my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), with the ambitious aim of making the UK the safest place in the world to go online. While other countries picked at the edges of that, we were the first place in the world to set ourselves such an ambitious task.

The legislation is mammoth in size and globally significant in scope. Its delivery has been long-winded, and I am so pleased that we have got to where we are now. As one of the Ministers who carried the baton for this legislation for around 19 months, I understand the balance that has to be struck between freedom of speech campaigners, charities and the considerable pressure from the platforms in getting this right.

Sir William Cash

I commend my hon. Friend for her remarks. May I point out that there is a provision in European legislation—I speak as Chairman of the European Scrutiny Committee—called the data services protection arrangements? They have nothing to compare with what we have in the Bill. That demonstrates the fact that when we legislate for ourselves we can get it right. That is something people ought to bear in mind.

Dame Caroline Dinenage

My hon. Friend is absolutely right to point that out. Much of the European legislation on this was taken from our own draft legislation, but has not gone anywhere near as far in the protections it offers.

We know that the internet is magnificent and life changing in so many ways, but that the dark corners present such a serious concern for children and scores of other vulnerable people. I associate myself with the comments of the hon. Member for Pontypridd (Alex Davies-Jones) on the child protection campaigners who have worked so incredibly hard on this issue, particularly those who have experienced the unimaginable tragedy of losing children as a result of what they have seen in the online world. To turn an unspeakable tragedy of that nature into a campaign to save the lives of others is the ultimate thing to do, and they deserve our massive thanks and gratitude.

I am also grateful to so many of our noble colleagues who have shaped the Bill using their unique knowledge and expertise. I would like to mention a few of them and welcome the changes they brought, but also thank the Minister and the Government for accepting so many of the challenges they brought forward and adapting them into the Bill. We all owe a massive debt of gratitude to Baroness Kidron for her tireless campaign for children’s protections. A children’s safety stalwart and pioneer for many years, virtually no one else knows more about this vital issue. It is absolutely right that the cornerstone and priority of the Bill must be to protect children. The Minister mentioned that the statistics are absolutely horrible and disturbing. That is why it is important that the Secretary of State will now be able to require providers to retain data relating to child sexual exploitation and abuse, ensuring that law enforcement does not have one hand tied behind its back when it comes to investigating these terrible crimes.

I also welcome the commitment to the new powers given to Ofcom and the expectations of providers regarding access to content and information in the terrible event of the death of a child. The tragic suicide of Molly Russell, the long and valiant battle of her dad, Ian, to get access to the social media content that played such a key role in it, and the delay that this brought to the inquest, are the only example we need of why this is absolutely the right thing to do. I know Baroness Kidron played a big part in that, as did my right hon. Friend the Member for Bromsgrove (Sajid Javid).

I am still concerned that there are not enough protections for vulnerable adults or for when people reach the cliff-edge of the age of 18. People of all ages need protection from extremely harmful content online. I am still not 100% convinced that user empowerment tools will provide that, but I look forward to being proved wrong.

I welcome the news that Ofcom is now required to produce guidance setting out how companies can tackle online violence against women and girls and demonstrate best practice. I am thankful to the former Equalities Minister, Baroness Morgan of Cotes, for her work on that. It is a vital piece of the puzzle that was missing from the original Bill, which did not specifically mention women or girls at all as far as I can remember.

It is important to stay faithful to the original thread of the Bill. To futureproof it, it has to be about systems and processes, rather than specific threats, but the simple fact is that the online world is so much more hostile for women. For black women, it is even worse. Illegal activity such as stalking and harassment is a daily occurrence for so many women and girls online. Over one in 10 women in England have experienced online violence and three in 10 have witnessed it. We also know that women and girls are disproportionately affected by the abuse of intimate images and the sharing of deepfakes, so it is welcome that those will become an offence. I also welcome that controlling and coercive behaviour, which has been made a recognised offence in real life, will now be listed as a priority offence online. That is something else the Government should take pride in.

I thank Baroness Merron for bringing animal welfare into the scope of the Bill. All in-scope platforms will have proactive duties to tackle content amounting to the offence of causing unnecessary suffering of animals. I thank Ministers for taking that on board. Anyone who victimises beings smaller and weaker than themselves, whether children or animals, is the most despicable kind of coward. It shows the level of depravity in parts of the online world that the act of hurting animals for pleasure is even a thing. A recent BBC story uncovered the torture of baby monkeys in Indonesia. The fact that individuals in the UK and the US are profiting from that, and that it was shared on platforms such as Facebook, is horrifying.

In the brief time left available to me, I must admit to still being a bit confused over the Government’s stance on end-to-end encryption. It sounds like the Minister has acknowledged that there is no sufficiently accurate and privacy-preserving technology currently in existence, and that the last resort power would only come into effect once the technology was there. Technically, that means the Government have not moved on the requirement of Ofcom to use last resort powers. Many security experts believe it could be many years before any such technology is developed, if ever, and that worries me. I am, of course, very supportive of protecting user privacy, but it is also fundamentally right that terrorism or child sexual exploitation rings should not be able to proliferate unhindered on these channels. The right to privacy must be trumped by the need to stop events that could lead to mass death and the harm of innocent adults and children. As my hon. Friend the Member for Folkestone and Hythe (Damian Collins) said, that is also against their terms of service. I would therefore welcome it if the Minister were to make a couple of comments on that.

I also welcome the changes brought forward by Baroness Morgan of Cotes on the categorisation of harm. I, too, have been one of the long-standing voices over successive stages of the Bill saying that a platform’s size should not be the only measure of harm. Clearly, massive platforms, by definition of their reach, have huge potential to spread harmful content, but we know that online platforms can go viral overnight. We know there are some small but incredibly pernicious platforms out there. Surely the harmful content on a site should be the definer of how harmful it is, not just its size. I welcome the increased flexibility for the Secretary of State to set a threshold based on the number of users, or the functionality offered, or both. I would love to know a little more about how that would work in practice.

We were the first country in the world to set out the ambitious target of comprehensive online safety legislation. Since then, so much time has passed. Other countries and the EU have legislated while we have refined, and in the meantime so much harm has been able to proliferate. We now need to get this done. We are so close to getting this legislation over the finish line. Can the Minister assure me that we are sending out a very clear message to providers that they must start their work now? They must not wait for this legislation to be in place, because people are suffering while the delays happen.

I put on record my thanks to Members of this House and the other place who have worked so hard to get the legislation into such a great state, and to Ministers who have listened very carefully to all their suggestions and expertise. Finally, I put on record my thanks to the incredible Government officials. I was responsible for shepherding the Bill for a mere 19 months. It nearly finished me off, but some officials have been involved in it right from the beginning. They deserve our enormous gratitude for everything they have done.

Several hon. Members rose—

Mr Deputy Speaker (Sir Roger Gale)

Order. Thirteen Members wish to participate in the debate. The winding-up speeches will need to start shortly before 5 pm, and the Minister has indicated that he has quite a bit to say. I therefore suggest a self-denying ordinance of between seven and eight minutes following the speech from the Scottish National party spokesman. It is up to colleagues, because we have not imposed a mandatory time limit at this stage, but if Members are sensible and not greedy, everyone should get in with no difficulty.

14:59
Kirsty Blackman (Aberdeen North) (SNP)

It is a pleasure to speak during what I hope are the final stages of the Bill. Given that nearly all the Bills on which I have spoken up to now have been money Bills, this business of “coming back from the Lords” and scrutinising Lords amendments has not been part of my experience, so if I get anything wrong, I apologise.

Like other Members, I want to begin by thanking a number of people and organisations, including the Mental Health Foundation, Carnegie UK, the Internet Watch Foundation, the National Society for the Prevention of Cruelty to Children and two researchers for the SNP, Aaron Lucas and Josh Simmonds-Upton, for all their work, advice, knowledge and wisdom. I also join the hon. Members for Pontypridd (Alex Davies-Jones) and for Gosport (Dame Caroline Dinenage) in thanking the families involved for the huge amount of time and energy—and the huge amount of themselves—that they have had to pour into the process in order to secure these changes. This is the beginning of the culmination of all their hard work. It will make a difference today, and it will make a difference when the Bill is enacted. Members in all parts of the House will do what we can to continue to scrutinise its operation to ensure that it works as intended, to ensure that children are kept as safe as possible online, and to ensure that Ofcom uses these powers to persuade platforms to provide the information that they will be required to provide following the death of a child about that child’s use of social media.

The Bill is about keeping people safe. It is a different Bill from the one that began its parliamentary journey, I think, more than two years ago. I have seen various Ministers leading from the Dispatch Box during that time, but the voices around the Chamber have been consistent, from the Conservative, Labour and SNP Benches. All the Members who have spoken have agreed that we want the internet to be a safer place. I am extremely glad that the Government have made so many of the concessions that the Opposition parties called for. I congratulate the hon. Member for Pontypridd on the inclusion of violence against women and girls in the Bill. She championed that in Committee, and I am glad that the Government have made the change.

Another change that the Government have made relates to small high-risk platforms. Back in May or June last year I tabled amendments 80, 81 and 82, which called for that categorisation to be changed so that it was not based just on the number of users. I think it was the hon. Member for Gosport who mentioned 4chan, and I have mentioned Kiwi Farms a number of times in the Chamber. Such organisations cannot be allowed to get away with horrific, vile content that encourages violence. They cannot be allowed a lower bar just because they have a smaller number of users.

The National Risk Register produced by the Cabinet Office—great bedtime reading, which I thoroughly recommend—states that both the likelihood of harm and the number of people on whom it will have an impact should be taken into account before a decision is made. It is therefore entirely sensible for the Government to take into account both the number of users, when it is a significant number, and the extremely high risk of harm caused by some of these providers.

Sir John Hayes

The hon. Lady is making an excellent speech, but it is critical to understand that this is not just about wickedness that would have taken place anyway but is now taking place on the internet; it is about the internet catalysing and exaggerating that wickedness, and spawning and encouraging all kinds of malevolence. We have a big responsibility in this place to regulate, control and indeed stop this, and the hon. Lady is right to emphasise that.

Kirsty Blackman

The right hon. Gentleman is entirely correct. Whether it involves a particularly right-wing cause or antisemitism—or, indeed, dieting content that drags people into something more radical in relation to eating disorders—the bubble mentality created by these algorithms massively increases the risk of radicalisation, and we therefore have an increased duty to protect people.

As I have said, I am pleased to see the positive changes that have been made as a result of Opposition pressure and the uncompromising efforts of those in the House of Lords, especially Baroness Kidron, who has been nothing short of tenacious. Throughout the time in which we have been discussing the Bill, I have spoken to Members of both Houses about it, and it has been very unusual to come across anyone who knows what they are talking about and, in particular, who has the incredible depth of knowledge, understanding and wisdom shown by Baroness Kidron. I was able to speak to her as someone who practically grew up on the internet—we had it at home when I was eight—but she knew far more about it than I did. I am extremely pleased that the Government have worked with her to improve the Bill, and have accepted that she has a huge breadth of knowledge. She managed to do what we did not quite manage to do in this House, although hopefully we laid the foundations.

I want to refer to a number of points that were mentioned by the Minister and are also mentioned in the letters that the Government provided relating to the Lords amendments. Algorithmic scrutiny is incredibly important, and I, along with other Members, have raised it a number of times—again, in connection with concern about radicalisation. Some organisations have been doing better things recently. For instance, someone who searches for something may begin to go down a rabbit hole. Some companies are now putting up a flag, such as a video, suggesting that users are going down a dark hole and should look at something a bit lighter, and directing them away from the autoplaying of more radical content. If all organisations, or at least a significant number—particularly those with high traffic—can be encouraged to take such action rather than allowing people to be driven to more extreme content, that will be a positive step.

I was pleased to hear about the upcoming researcher access report, and about the report on app stores. I asked a previous Minister about app stores a year or so ago; the Minister said that they were not included, and that was the end of it. Given the risk that is posed by app stores, the fact that they were not categorised as user-to-user services concerned me greatly. Someone who wants to put something on an Apple app store has to jump through Apple’s hoops, but the content is not owned by the app store, and the same applies to some of the material on the PlayStation store. It is owned by the person who created it, and it is therefore user-to-user content. In some cases, it is created by one individual, and there is no ongoing review of it. Age rating is another issue: app stores choose whatever age rating they happen to decide on. Some of the dating apps, such as match.com, have been active in that regard and have made it clear that their platforms are not for under-16s or under-18s, yet the app store has rated the content as suitable for a younger age than the platforms’ own minimum. That is of concern, especially if the companies are trying to improve age rating.

On the subject of age rating, I am pleased to see more in the Bill about age assurance and the frameworks. I am particularly pleased to see what is going to happen in relation to trying to stop children being able to access pornography. That is incredibly important but it had been missing from the Bill. I understand that Baroness Floella Benjamin has done a huge amount of work on pushing this forward and ensuring that parliamentarians are briefed on it, and I thank her for the work that she has done. Human trafficking has also been included. Again, that was something that we pushed for, and I am glad to see that it has been put on the face of the Bill.

I want to talk briefly about the review mechanisms, then I will go on to talk about end-to-end encryption. I am still concerned that the review mechanisms are not strong enough. We have pushed to have a parliamentary Committee convened, for example, to review this legislation. This is the fastest-moving area of life. Things are changing so dramatically. How many people in here had even heard of ChatGPT a year and a half ago? How many people had used a virtual reality headset? How many people had accessed Rec Room or any of the other VR systems? I understand that the Government have genuinely tried their best to make the Bill as future-proof as possible, but we have no parliamentary scrutiny mechanisms written in. I am not trying to undermine the work of the Committee on this—I think it is incredibly important—but Select Committees are busy and they have no legislative power in this regard. If the Government had written in a review, that would have been incredibly helpful.

David Davis Portrait Mr David Davis (Haltemprice and Howden) (Con)
- Hansard - - - Excerpts

The hon. Lady is making a very good speech. When I first came to this House, which was rather a long time ago now, there was a Companies Act every year, because company law was changing at the time, as was the nature of post-war capitalism. It seems to me that there is a strong argument for an annual Act on the handling and management of the internet. What she is saying is exactly right, and that is probably where we will end up.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I completely support the right hon. Member’s point—I would love to see this happening on an annual basis. I am sure that the Ministers who have shepherded the Bill through would be terrified of that, and that the Government team sitting over there are probably quaking in their boots at the suggestion, but given how fast this moves, I think that this would be incredibly important.

The Government’s record on post-implementation reviews of legislation is pretty shoddy. If you ask Government Departments what percentage of legislation they have put through a post-implementation review in the timeline they were supposed to, they will say that it is very small. Some Departments are a bit better than others, but given the number of reshuffles there have been, some do not even know which pieces of legislation they are supposed to be reviewing post-implementation. I am concerned that this legislation will get lost, and that there is no legislative back-up to any of the mechanisms for reviewing it. The Minister has said that it will be kept under review, but can we have some sort of governmental commitment that an actual review will take place, and that legislation will be forthcoming if necessary, to ensure that the implementation of this Bill is carried out as intended? We are not necessarily asking the Government to change it; we are just asking them to cover all the things that they intend it to cover.

On end-to-end encryption, on child sexual exploitation and abuse materials, and on the last resort provider—I have been consistent with every Minister I have spoken to across the Dispatch Box and every time I have spoken to hon. Members about this—when there is any use of child sexual exploitation material or child sexual abuse material, we should be able to require the provider to find it. That absolutely trumps privacy. The largest increase in child sexual abuse material is in self-generated content. That is horrific. We are seeing a massive increase in that number. We need providers to be able to search—using the hash numbers that they can categorise images with, or however they want to do it—for people who are sharing this material in order to allow the authorities to arrest them and put them behind bars so that they cannot cause any more harm to children. That is more important than any privacy concerns. Although Ministers have not put it in the Bill until this point, they have, to their credit, been clear that that is more important than any privacy concerns, and that protecting children trumps those concerns when it comes to abuse materials and exploitation. I am glad to see that that is now written into the Bill; it is important that it was not just stated at the Dispatch Box, even though it was mentioned by a number of Members.

15:15
I have spoken about the huge number of online harms, the huge number of issues with accessing the internet and the huge number of concerns we have for the future, but I also need to say that the internet is a wonderful place. It is absolutely great to be able to go and play online games. It is great to be able to have a community of people that I can speak to online and have a conversation with. It is good that the Government have included in some of the risks the issues about platforms where adults can contact children, for example. That was another thing I addressed during the course of the amendments. It is great that people can find their tribe online in a way that they perhaps cannot do in real life. It is brilliant that people can have a try-out at being someone else online. That is not about trying to confuse or upset people or about catfishing. Sometimes we need to have a wee bit of self-exploration in order to try and work out who we are. There are so many positive aspects of the internet, but we need to ensure that children and the most vulnerable adults in particular are kept safe online.
This is not the perfect Bill. This is not necessarily the Bill that I would have liked to see. It has gone through so many changes and iterations over the time we have been trying to scrutinise it that some of it has gone back to what it previously looked like, except for the provisions on harmful content in relation to adults. I am pleased that the internet will be a safer place for our children and our children’s children. I am pleased that they will have more protections online. I have a degree of faith and cautious optimism in the work of Ofcom, because of how fast it has been scaling up and because of the incredible people it has employed to work there—they really know what they are talking about. I wish the Government and Ofcom every success in ensuring that the Bill is embedded and ensuring that the internet is as safe as possible. I would just really like a commitment from the Minister on ensuring that this legislation is kept under proper review and that legislative change will be made, should we identify any loopholes.
Damian Collins Portrait Damian Collins
- View Speech - Hansard - - - Excerpts

The draft Bill was published in April 2021, so it is fantastic that we are now discussing its final stages after it has gone through its processes in the House of Lords. It went through pre-legislative scrutiny, then it was introduced here, committed to the Bill Committee, recommitted, came back to the House, went to the Lords and came back again. I do not think any Bill has had as much scrutiny and debate over such a long period of time as this one has had. Hon. Members have disagreed on it from time to time, but the spirit and motivation at every stage have never been political; it has been about trying to make the Bill the best it can possibly be. We have ended up with a process that has seen it get better through all its stages.

Picking up on the comments of the hon. Member for Aberdeen North (Kirsty Blackman) and others, the question of ongoing scrutiny of the regime is an important one. In the pre-legislative scrutiny Committee—the Joint Committee that I chaired—there was a recommendation that there should be a post-legislative scrutiny Committee or a new Joint Committee, perhaps for a limited period. The pre-legislative scrutiny Committee benefited enormously from being a Committee of both Houses. Baroness Kidron has rightly been mentioned by Members today, and she is watching us from the Gallery. She is keeping up her scrutiny of the passage of the Bill from her position of advantage there.

We have discussed a number of new technologies during the Bill’s passage that were not discussed at all on Second Reading because they were not live, including the metaverse and large language models. We are reassured that the Bill is futureproof, but we will not know until we come across such things. Ongoing scrutiny of the regime, the codes of practice and Ofcom’s risk registers is more than any one Select Committee can do. The Government have previously spoken favourably of the idea of post-legislative scrutiny, and it would be good if the Minister could say whether that is still under consideration.

John Hayes Portrait Sir John Hayes
- Hansard - - - Excerpts

My hon. Friend makes a powerful point, echoing the comments of Members on both sides of the House. He is absolutely right that, as well as the scale and character of internet harms, their dynamism is a feature that Governments must take seriously. The problem, it seems to me, is that the pace of technological change, in this area and in others, does not fit easily with the thoroughness of the democratic legislative process; we tend to want to do things at length, because we want to scrutinise them properly, and that takes time. How does my hon. Friend square that in his own mind, and what would he recommend to the Government?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The length of the process we have gone through on this Bill is a good thing, because we have ended up with probably the most comprehensive legislation in the world. We have a regulator with more power, and more power to sanction, than anywhere else. It is important to get that right.

A lot of the regulation is principle-based. It is about the regulation of user-to-user services, whereby people share things with each other through an intermediary service. Technology will develop, but those principles will underpin a lot of it. There will be specific cases where we need to think about whether the regulatory oversight works in a metaverse environment in which we are dealing with harms created by speech that has no footprint. How do we monitor and scrutinise that?

One of the hardest challenges could be making sure that companies continue to use appropriate technology to identify and mitigate harms on their platforms. The problem we have had with the regime to date is that we have relied on self-reporting from the technology companies on what is or is not possible. Indeed, the debate about end-to-end encryption is another example. The companies are saying that, if they share too much data, there is a danger that it will break encryption, but they will not say what data they gather or how they use it. For example, they will not say how they identify illegal use of their platform. Can they see the messages that people have sent after they have sent them? They will not publicly acknowledge it, and they will not say what data they gather and what triggers they could use to intervene, but the regulator will now have the right to see them. That principle of accountability and the power of the regulator to scrutinise are the two things that make me confident that this will work, but we may need to make amendments because of new things that we have not yet thought about.

William Cash Portrait Sir William Cash
- Hansard - - - Excerpts

In addition to the idea of annual scrutiny raised by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), does my hon. Friend think it would be a reasonably good idea for the Select Committee on Culture, Media and Sport to set up a Sub-Committee under its Standing Orders to keep an eye on this stuff? My hon. Friend was a great Chairman of that Select Committee, and such a Sub-Committee would allow the annual monitoring of all the things that could go wrong, and it could also try to keep up with the pace of change.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

When I chaired the Digital, Culture, Media and Sport Committee, we set up a Sub-Committee to consider these issues and internet regulation. Of course, the Sub-Committee has the same members. It is up to the Select Committee to determine how it structures itself and spends its time, but there is only so much that any one departmental Select Committee can do among its huge range of other responsibilities. It might be worth thinking about a special Committee, drawing on the powers and knowledge of both Houses, but that is not a matter for the Bill. As my hon. Friend knows, it is a matter of amending the Standing Orders of the House, and the House must decide that it wants to create such a Committee. I think it is something we should consider.

We must make sure that encrypted services have proper transparency and accountability, and we must bring in skilled experts. Members have talked about researcher access to the companies’ data and information, and it cannot be a free-for-all; there has to be a process by which a researcher applies to get privileged access to a company’s information. Indeed, as part of responding to Ofcom’s risk registers, a company could say that allowing researchers access is one of the ways it seeks to ensure safe use of its platform, by seeking the help of others to identify harm.

There is nothing to stop Ofcom appointing many researchers. The Bill gives Ofcom the power to delegate its authority and its powers to outside expert researchers to investigate matters on its behalf. In my view, that would be a good thing for Ofcom to do, because it will not have all the expertise in-house. The power to appoint a skilled person to use the powers of Ofcom exists within the Bill, and Ofcom should say that it intends to use that power widely. I would be grateful if the Minister could confirm that Ofcom has that power in the Bill.

Jamie Stone Portrait Jamie Stone (Caithness, Sutherland and Easter Ross) (LD)
- View Speech - Hansard - - - Excerpts

It is very kind of you to call me to speak, Mr Deputy Speaker. I apologise to your good self, to the Minister and to the House for arriving rather tardily.

My daughter and her husband have been staying with me over the past few days. When I get up to make my wife and myself an early-morning cup of tea, I find my two grandchildren sitting in the kitchen with their iPads, which does not half bring home the dangers. I look at them and think, “Gosh, I hope there is security, because they are just little kids.” I worry about that kind of thing. As everyone has said, keeping children safe is ever more important.

The Bill’s progress shows some of the best aspects of this place and the other place working together to improve legislation. The shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), and the hon. Member for Aberdeen North (Kirsty Blackman) both mentioned that, and it has been encouraging to see how the Bill has come together. However, as others have said, it has taken a long time and there have been a lot of delays. Perhaps that was unavoidable, but it is regrettable. It has been difficult for the Government to get the Bill to where it is today, and the trouble is that the delays mean there will probably be more victims before the Bill is enacted. We see before us a much-changed Bill, and I thank the Lords for their 150 amendments. They have put in a lot of hard work, as others have said.

The Secretary of State’s powers worry my party and me, and I wonder whether the Bill still fails to tackle harmful activity effectively. Perhaps better things could be done, but we are where we are. I welcome the addition of new offences, such as encouraging self-harm and intimate image abuse. A future Bill might be needed to set out the thresholds for the prosecution of non-fatal self-harm. We may also need further work on the intent requirement for cyber-flashing, and on whether Ofcom can introduce such requirements. I am encouraged by what we have heard from the Minister.

We would also have liked to see more movement on risk assessment, as terms of service should be subject to a mandatory risk assessment. My party remains unconvinced that we have got to grips with the metaverse—this terrifying new thing that has come at us. I think there is work to be done on that, and we will see what happens in the future.

As others have said, education is crucial. I hope that my grandchildren, sitting there with their iPads, have been told as much as possible by their teachers, my daughter and my son-in-law about what to do and what not to do. That leads me on to the huge importance of the parent being able, where necessary, to intervene rapidly, because this has to be done damned quickly. If it looks like they are going down a black hole, we want to stop that right away. A kid could see something horrid that could damage them for life—it could be that bad.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Once a child sees something, they cannot unsee it. This is not just about parental controls; we hope that the requirement on the companies to do the risk assessments and on Ofcom to look at those will mean that those issues are stopped before they even get to the point of requiring parental controls. I hope that such an approach will make this safer by design when it begins to operate, rather than relying on having an active parent who is not working three jobs and therefore has time to moderate what their children are doing online.

Jamie Stone Portrait Jamie Stone
- Hansard - - - Excerpts

The hon. Lady makes an excellent point. Let me just illustrate it by saying that each of us in our childhood, when we were little—when we were four, five or six—saw something that frightened us. Oddly enough, we never forget that throughout the rest of life, do we? That is what bad dreams are made of. We should remember that point, which is why those are wise words indeed.

15:30
Finally, I shall try your excellent patience, Mr Deputy Speaker, with a few words about encryption, to which reference has been made. I commend the Government for their recognition of the dangers that exist online and the inadequacy of current protections. However, regulation and enforcement must be based on clear evidence of well-defined harm and must respect the rights to privacy and free expression of those who use social media legally and responsibly. On encryption, for the vast majority, privacy means security. We have always to test that theory, but I think that is what most of us believe. If I picked it up right, the right hon. Member for Haltemprice and Howden (Mr Davis) said that this should be revisited on a regular basis. Perhaps the advisers that Ofcom will hire will address this sort of thing, but this is about constant vigilance, is it not? Let me put it on the record that my party would fundamentally oppose any attempts to undermine or weaken encryption.
Once again, I wish to thank all the Members who have put together a good piece of legislation. In the spirit of generosity, let me say that the Government have tried their very best on a tricky issue, and I give credit to those on both sides of the House for this step in the right direction.
Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- View Speech - Hansard - - - Excerpts

This Bill may well have been with us since April 2021 and been subject to significant change, but it remains a Bill about keeping people safer online and it remains groundbreaking. I welcome it back after scrutiny in the Lords and join others in paying tribute to those who have campaigned for social media platforms to release information following the death of a child. I am pleased that some are able to be with us today to hear this debate and the commitment to that issue.

This will never be a perfect Bill, but we must recognise that it is good enough and that we need to get it on to the statute book. The Minister has helped by saying clearly that this is not the endgame and that scrutiny will be inherent in the future of this legislation. I hope that he will heed the comments of my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who encouraged him to set up a bespoke Committee, which was one of the recommendations from the initial scrutiny of the Bill.

I will confine my remarks to the Government’s Lords amendment 263 and those surrounding it, which inserted the amendments I tabled on Report into the Bill. They relate to the sharing of intimate images online, including deepfakes, without consent. I wish wholeheartedly to say thank you to the Minister, who always listens intently, to the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar), who has recently joined him, and to the Secretary of State for Science, Innovation and Technology. They have all not only listened to the arguments on intimate image abuse, but acted. The changes today are no less a testament to their commitment to this Bill than those in any other area. Focusing on children’s safety is very important, but the safety of adults online is also important. We started on a journey to address intimate image abuse way back in 2015, with the Criminal Justice and Courts Act 2015, and we have since learned to provide that protection much better, mostly through the work of the Law Commission and its report on how we should be tackling intimate image abuse online.

The Bill, as it has been amended, has been changed fundamentally on the treatment of intimate image abuse, in line with the debate on Report in this place. That has created four new offences. The base offence removes the idea of intent to cause distress entirely and relies only on whether there was consent from the person appearing in the image. Two more serious offences do include intent, with one being sending an image with intent to cause alarm and distress. We also now have the offence of threatening to share an image, which will protect people from potential blackmail, particularly from an abusive partner. That will make a huge difference for victims, who are still overwhelmingly women.

In his closing comments, will the Minister address the gaps that still exist, particularly around the issue of the images themselves, which, because of the scope of the Bill, will not become illegal? He and his colleagues have indicated that more legislation might be in the planning stages to address those particular recommendations by the Law Commission. Perhaps he could also comment on something that the Revenge Porn Helpline is increasingly being told by victims, which is that online platforms will not remove an image even though it may have been posted illegally, and that will not change in the future. Perhaps he can give me and those victims who might be listening today some comfort that either there are ways of addressing that matter now or that he will address it in the very near future.

Richard Burgon Portrait Richard Burgon (Leeds East) (Lab)
- View Speech - Hansard - - - Excerpts

As we reflect on the Bill today, it is important to say that it has been improved as it has progressed through Parliament. That is due in no small measure to Members from across the parties—both here and in the other place—who have engaged very collegiately, and to individuals and groups outside this place, particularly the Samaritans and those who have lived experience of the consequences of the dangers of the internet.

People from my constituency have also been involved, including the family of Joe Nihill, whom I have mentioned previously. At the age of 23, Joe took his own life after accessing dangerous suicide-related online content. His mother, Catherine, and sister-in-law, Melanie, have bravely campaigned to use the Online Safety Bill as an opportunity to ensure that what happened to Joe so tragically does not happen to others. I thank the Minister and his team for meeting Joe’s mother, his sister-in-law and me, and for listening to what we had to say. I recognise that, as a result, the Bill has improved, in particular with the Government’s acceptance of Lords amendment 391, which was first tabled by Baroness Morgan of Cotes. It is welcome that the Government have accepted the amendment, which will enable platforms to be placed in category 1 based on their functionality, even if they do not have a large reach. That is important, because some of the worst and most dangerous online suicide and self-harm related material appears on smaller platforms rather than the larger ones.

I also welcome the fact that the Bill creates a new communications offence of encouraging or assisting self-harm and makes such content a further priority for action, which is important. The Bill provides an historic opportunity to ensure that tackling suicide and self-harm related online content does not end with this Bill becoming law. I urge the Government to listen very carefully to what the Samaritans have said. As my hon. Friend the shadow Minister asked, will the Government commit to a review of the legislation to ensure that it has met the objective of making our country the safest place in the world in which to go online? Importantly, can the Government confirm when the consultation on the new offence of encouraging or assisting self-harm will take place?

As I mentioned in an intervention, it is clear that the Government want to tackle harmful suicide and self-harm related content with the Bill, but, as we have heard throughout our discussions, the measures do not go far enough. The Samaritans were correct to say that the Bill represents a welcome advance and that it has improved recently, but it still does not go far enough in relation to dangerous suicide and self-harm online content. How will the Government engage with people who have lived experience—people such as Melanie and Catherine—to ensure that the new laws make things better? Nobody wants the implementation of the Bill to be the end of the matter. We must redouble our efforts to make the internet as safe a place as possible, reflect on the experiences of my constituents, Joe Nihill and his family, and understand that there is a lot of dangerous suicide and self-harm related content out there. We are talking about people who exploit the vulnerable, regardless of their age.

I urge all those who are following the progress of the Bill and who look at this issue not to make the mistake of thinking that when we talk about dangerous online suicide and self-harm related content, it is somehow about freedom of speech. It is about protecting people. When we talk about dangerous online material relating to suicide and self-harm, it is not a freedom of speech issue; it is an issue of protecting people.

William Cash Portrait Sir William Cash
- Hansard - - - Excerpts

Has the hon. Gentleman noted, I hope with satisfaction, that the Government yesterday and today have made statements on a strategy for preventing suicide nationally, and that what he is saying—which I agree with—will be implemented? It has just been announced, it is very important and it is related to the Bill.

Richard Burgon Portrait Richard Burgon
- Hansard - - - Excerpts

I thank the hon. Gentleman for his intervention. It is important that the Government have announced a strategy: it is part and parcel of the ongoing work that is so necessary when we consider the prevalence of suicide as the leading cause of death among young men and women. It is a scourge across society. People should not make the mistake of thinking that the internet merely showcases awful things. The internet has been used as a tool by exploitative and sometimes disturbed individuals to create more misery and more instances of awful things happening, and to lead others down a dangerous path that sometimes ends, sadly, in them taking their own lives.

I thank the Minister for his engagement with my constituents, and the shadow Minister for what she has done. I also thank Baroness Kidron, Baroness Morgan and hon. Members who have engaged with this issue. I urge the Government to see the Bill not as the end when it comes to tackling dangerous online content related to suicide and self-harm, but as part of an important ongoing journey that we all work on together.

Siobhan Baillie Portrait Siobhan Baillie (Stroud) (Con)
- View Speech - Hansard - - - Excerpts

I rise to speak to Lords amendment 231 on visible identity verification. I will not press the amendment to a vote. I have had several discussions with Ministers and the Secretary of State, and I am grateful for their time. I will explain a little more.

The dry nature of the amendment masks the fact that the issue of identity verification—or lack of it—affects millions of people around the country. We increasingly live our lives online, so the public being able to know who is or is not a real person online is a key part of the UK being the safest place to be on the internet, which is the Bill’s ambition. Unfortunately, too often it feels as though we have to wade through nutters, bots, fake accounts and other nasties before coming to a real person we want to hear from. The Bill takes huge steps to empower users to change that, but there is more to do.

Hon. Members will recall that I have campaigned for years to tackle anonymous abuse. I thank the Stroud constituents, celebrities and parents who have brought to me sad stories, which I have conveyed to the House, involving abuse about the deaths of babies and children and about disabled children. That is absolutely awful.

Alongside a smart Stroud constituent and Clean Up The Internet—a fantastic organisation—we have fought and argued for social media users to have the option of being verified online; for them to be able to follow and be followed only by verified accounts, if that is what they want; and, crucially, to make it clear who is and is not verified online. People can still be Princess Unicorn if they want, but at the back end, their address and details can be held, and that will give confidence.

John Hayes Portrait Sir John Hayes
- Hansard - - - Excerpts

My hon. Friend is making a powerful case. Umberto Eco, the Italian philosopher, described the internet as the empire of imbeciles, and much of social media is indeed imbecilic—but it is much worse than that. My hon. Friend is right that the internet provides a hiding place for the kind of malevolence she has described. Does she agree that the critical thing is for the Government to look again at the responsibility of those who publish this material? If it were written material, the publisher would have a legal liability. That is not true of internet companies. Is that a way forward?

15:47
Siobhan Baillie Portrait Siobhan Baillie
- Hansard - - - Excerpts

I am interested in that intervention, but I fear it would lead us into a very long discussion and I want to keep my comments focused on my amendment. However, it would be interesting to hear from the Minister in response to that point, because it is a huge topic for debate.

On the point about whether someone is real or not real online, I believe passionately that not only famous people or those who can afford it should be able to show that they are a real and verified person. I say, “Roll out the blue ticks”—or the equivalents—and not just to make the social media platforms more money; as we have seen, we need it as a safety mechanism and a personal responsibility mechanism.

All the evidence and endless polling show that the public want to know who is and who is not real online, and it does not take rocket science to understand why. Dealing with faceless, anonymous accounts is very scary and anonymous abusers are terrifying. Parents are worried that they do not know who their children are speaking to, and anonymous, unverified accounts cannot be traced if details are not held.

That is before we get to how visible verification can help to tackle fraud. We should empower people to avoid fake accounts. We know that people are less likely to engage with an unverified account, and it would make it easy to catch scammers. Fraud was the most common form of crime in 2022, with 41% of all crimes being fraud, 23% of all reported fraud being initiated on social media and 80% of fraud being cyber-related. We can imagine just how fantastically clever the scams will become through AI.

Since we started this process, tech companies have recognised the value of identity verification to the public, so much so that they now sell it on Twitter as blue ticks, and the Government understand the benefits of identity verification options. The Government have done a huge amount of work on that. I thank them for agreeing to two of the three pillars of my campaign, and I believe we can get there on visibility; I know from discussions with Government that Ofcom will be looking carefully at that.

Making things simple for social media users is incredibly important. For the user verification provisions in this Bill to fulfil their potential and prevent harm, including illegal harm, we believe that users need to be able to see who is and is not verified—that is, who is a real person—and all the evidence says that that is what the public wants.

While Ministers in this place and the other place have resisted putting visible verification on the face of the Bill, I am grateful to the Government for their work on this. After a lot of to-ing and fro-ing, we are reassured that the Bill as now worded gives Ofcom the powers to do what the public wants and what we are suggesting through codes and guidance. We hope that Ofcom will consider the role of anonymous, inauthentic and non-verified accounts as it prepares its register of risks relating to illegal content and in its risk profiles.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I pay tribute to the way my hon. Friend has focused on this issue through so many months and years. Does she agree that, in light of the assurances that she has had from the Minister, this is just the sort of issue that either a stand-alone committee or some kind of scrutiny group could keep an eye on? If those guidelines do not work as the Minister is hoping, the action she has suggested will need to be taken.

Siobhan Baillie Portrait Siobhan Baillie
- Hansard - - - Excerpts

Absolutely. Given the fast nature of social media and the tech world, and how quickly they adapt—often for their own benefit, sadly—I think that a committee with that focus could work.

To wrap up, I thank MPs from across the House, and you, Madam Deputy Speaker, for your grace today. I have had help from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) in particular, for which I am very grateful. In the other place, Lord Clement-Jones, Lord Stevenson, Baroness Morgan, Baroness Fall and Baroness Wyld have all been absolutely excellent in pushing through these matters. I look forward to hearing what the Minister says, and thank everybody for their time.

Jeremy Wright Portrait Sir Jeremy Wright (Kenilworth and Southam) (Con)
- View Speech - Hansard - - - Excerpts

As others have done, I welcome the considerable progress made on the Bill in the other place, both in the detailed scrutiny that it has received from noble Lords, who have taken a consistent and expert interest in it, and in the positive and consensual tone adopted by Opposition Front Benchers and, crucially, by Ministers.

It seems that there are very few Members of this House who have not had ministerial responsibility for the Bill at some point in what has been an extraordinarily extensive relay race as it has moved through its legislative stages. The anchor leg—the hardest bit in such a Bill—has been run with dedication and skill by my right hon. Friend the Secretary of State, who deserves all the praise that she will get for holding the baton as we cross the parliamentary finish line, as I hope we are close to doing.

I have been an advocate of humility in the way in which we all approach this legislation. It is genuinely difficult and novel territory. In general, I think that my right hon. Friend the Secretary of State and her Ministers—the noble Lord Parkinson and, of course, the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Sutton and Cheam (Paul Scully)—have been willing to change their minds when it was right to do so, and the Bill is better for it. Like others who have dealt with them, I also thank the officials, some of whom sit in the Box, some of whom do not. They have dedicated—as I suspect they would see it—most of their lives to the generation of the Bill, and we are grateful to them for their commitment.

Of course, as others have said, none of this means that the Bill is perfect; frankly, it was never going to be. Nor does it mean that when we pass the Bill, the job is done. We will then pass the baton to Ofcom, which will have a large amount of further work to do. However, we now need to finalise the legislative phase of this work after many years of consideration. For that reason, I welcome in particular what I think are sensible compromises on two significant issues that had yet to be resolved: first, the content of children’s risk assessments, and secondly, the categorisation process. I hope that the House will bear with me while I consider those in detail, which we have not yet done, starting with Lords amendments 17, 20 and 22, and Lords amendment 81 in relation to search, as well as the Government amendments in lieu of them.

Those Lords amendments insert harmful “features, functionalities or behaviours” into the list of matters that should be considered in the children’s risk assessment process and in the meeting of the safety duties, to add to the harms arising from the intrinsic nature of content itself—that is an important change. As others have done, I pay great tribute to the noble Baroness Kidron, who has invariably been the driving force behind so many of the positive enhancements to children’s online safety that the Bill will bring. She has promoted this enhancement, too. As she said, it is right to recognise and reflect in the legislation that a child’s online experience can be harmful not just as a result of the harm an individual piece of content can cause, but in the way that content is selected and presented to that child—in other words, the way in which the service is designed to operate. As she knows, however, I part company with the Lords amendments in the breadth of the language used, particularly the word “behaviours”.

Throughout our consideration of the Bill, I have taken the view that we should be less interested in passing legislation that sounds good and more interested in passing legislation that works. We need the regulator to be able to encourage and enforce improvements in online safety effectively. That means asking the online platforms to address the harms that it is within their power to address, and that relate clearly to the design or operation of the systems that they have put in place.

The difficulty with the wording of the Lords amendments is that they bring into the ambit of the legislation behaviours that are not necessarily enabled or created by the design or operation of the service. The language used is

“features, functionalities or behaviours (including those enabled or created by the design or operation of the service) that are harmful to children”—

in other words, not limited to those that are enabled or created by the service. It is a step too far to make platforms accountable for all behaviours that are harmful to children without the clarity of that link to what the platform has itself done. For that reason, I cannot support those Lords amendments.

However, the Government have proposed a sensible alternative approach in their amendments in lieu, particularly in relation to Lords amendment 17 and Lords amendment 81, which relates to search services. The Government amendments in lieu capture the central point that the design of a service can lead to harm, and they require a service to assess that as part of the children’s risk assessment process. That is a significant expansion of a service’s responsibilities in the risk assessment process, which reflects not just ongoing concern about types of harm that were not adequately captured in the Bill so far, but also the positive moves we have all sought to make towards safety by design as an important preventive concept in online safety.

I also think it is important, given the potential scale of this expanded responsibility, to make clear that the concept of proportionality applies to a service’s approach to this element of assessment and mitigation of risk, as it does throughout the Bill, and I hope the Minister will be able to do that when he winds up the debate.

William Cash Portrait Sir William Cash
- Hansard - - - Excerpts

My right hon. and learned Friend has mentioned Ofcom several times. I would like to ask his opinion as to whether there should be, if there is not already, a special provision for a report by Ofcom on its own involvement in these processes during the course of its annual report every year, to be sure that we know that Ofcom is doing its job. In Parliament, we know what Select Committees are doing. The question is, what is Ofcom doing on a continuous basis?

Jeremy Wright Portrait Sir Jeremy Wright
- Hansard - - - Excerpts

My hon. Friend makes a fair point. One difficult part of our legislative journey with the Bill is to get right, in so far as we can, the balance between what the regulator should take responsibility for, what Ministers should take responsibility for and what the legislature—this Parliament—should take responsibility for. We may not have got that exactly right yet.

On my hon. Friend’s specific point, my understanding is that because Ofcom must report to Parliament in any event, it will certainly be Ofcom’s intention to report back on this. It will be quite a large slice of what Ofcom does from this point onwards, so it would be remarkable if it did not, but I think we will have to return to the points that my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and others have made about the nature of parliamentary scrutiny that is then required to ensure that we are all on top of this progress as it develops.

I was talking about what I would like my hon. Friend the Minister to say when he winds up the debate. I know he will not have a huge amount of time to do so, but he might also confirm that the balancing duties in relation to freedom of speech and privacy, for example, continue to apply to the fulfilment of the safety duties in this context as well. That would be helpful.

The Government amendments in lieu do not replicate the reference to design in the safety duties themselves, but I do not see that as problematic because, as I understand it, the risks identified in the risk assessment process, which will now include design risks, feed through to and give rise to the safety duties, so that if a design risk is identified in the risk assessment, a service is required to mitigate and address it. Again, I would be grateful if the Minister confirmed that.

We should also recognise that Government amendment (b) in lieu of Lords amendment 17 and Government amendments (b) and (c) in lieu of Lords amendment 81 specifically require consideration of

“functionalities or other features of the service that affect how much children use the service”

As far as I can tell, that introduces consideration of design-related addiction—recognisable to many parents; it cannot just be me—into the assessment process. These changes reflect the reality of how online harm to children manifests itself, and the Government are to be congratulated on including them, although, as I say, the Government and, subsequently, Ofcom will need to be clear about what these new expectations mean in practical terms for a platform considering its risk assessment process and seeking to comply with its safety duties.

I now turn to the amendments dealing with the categorisation process, which are Lords amendment 391 and the Government amendments arising from it. Lords amendment 391 would allow Ofcom to designate a service as a category 1 service, with the additional expectations and responsibility that brings, if it is of a certain scale or if it has certain functionalities, rather than both being required as was the case in the original Bill. The effect of the original drafting was, in essence, that only big platforms could be category 1 platforms and that big platforms were bound to be category 1 platforms. That gave rise to two problems that, as my hon. Friend the Minister knows, we have discussed before.

16:00
The first problem was that smaller platforms where highly harmful material was to be found, whether organically or because it was seeking refuge from the greater regulation of larger platforms, could not be made subject to a more restrictive regime. The second was that larger platforms whose operations give rise to very little concern in the context of this Bill—Wikipedia being a common example—would have to be subject to more extensive regulatory requirements than is justified by the risk they really present. Lords amendment 391 in the name of my right hon. Friend the noble Baroness Morgan seeks to resolve those two problems at once. Given that I proposed an identical amendment in this House, I am unsurprisingly in favour of it, and I congratulate Baroness Morgan on doing a better job of persuading the other place of its merits than I managed to do in this place. I am pleased to see the Government effectively accept that amendment today.
Finally, I will say a few words about the amendments tabled by other right hon. and hon. Members. If you will forgive me, Madam Deputy Speaker, in the interests of time, I will not speak to all the amendments proposed by my hon. Friend the Member for Yeovil (Mr Fysh)—I can see that you approve. However, from what I have just said, he will gather that I cannot support his amendment (a) to Lords amendment 1, which would limit application of all the safety duties in the Bill to
“providers of significant size and capacity, and with a substantial involvement in the communication of media content”.
I cannot support my hon. Friend’s amendment for both technical and substantive reasons. The technical reason is that Lords amendment 1 adds an introductory clause to the Bill that is designed to be a guide to its contents and effects, and his amendment to that clause is not followed through in the rest of the Bill. As such, the introductory clause would say that the Bill’s scope is limited to larger platforms only, but the rest of the Bill would not say the same. The more substantive reason is that in my view, my hon. Friend’s amendment is both inappropriate and unnecessary. It is inappropriate because highly harmful content can be found on smaller platforms, and all platforms should surely do what they can to minimise harm to children and the presence of illegal content on their service, which are the focuses of the Bill. It is unnecessary because the concept of proportionality runs through the Bill, so the regulator’s expectations of small platforms can and should be different from its expectations of large ones.
My hon. Friend’s other amendments seek to avoid introducing, by means of the imposition of the safety duties, what he describes as
“systemic weakness and vulnerabilities relating to compliance with the duties”.
He seeks to do so in a number of places in the Bill. However, that concept of systemic weaknesses and vulnerabilities is not defined and could be extraordinarily wide, potentially undermining the whole purpose of those safety duties. I am being slightly unfair to my hon. Friend, because he has not spoken yet, but I think he is primarily concerned with the Bill’s effect on encrypted services. Others have expressed concern, too—my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and the hon. Member for Brighton, Pavilion (Caroline Lucas) have made their concern known through their amendment to Lords amendment 217—which raises an important question about where we are on encryption. Throughout the progress of the Bill, Ministers have been clear that it involves no ban on the use of encryption. However, as others have said, there will need to be some further clarity—not least, by the way, about the interaction of the regime we are creating with the data protection regime and the involvement of the Information Commissioner’s Office.
Encryption clearly cannot be a “get out of jail free” card for safety duty compliance. Surely, people cannot say, “I operate an encrypted service, so I do not have to comply with the safety duties.” Does it therefore follow that if there is no prohibition on the use of encryption and no exemption from safety duties just because a service uses it, each service that is within the scope of the Bill and uses encryption must show Ofcom that it can meet its safety duties proportionately and with due weight given to balancing duties—particularly on privacy—with the use of encryption? If a service cannot do so, does it follow that Ofcom will require that service to not use encryption, to the extent that that is necessary for it to meet its safety duties to Ofcom’s satisfaction? We need clarity on that point.
Finally, as I said at the start, the Bill is not perfect and there is still much work to be done, but if we can agree the final changes we are discussing and, indeed, if their Lordships are prepared to endorse that next week, the very real prize to be won is that Ofcom can begin the work that it needs to do sooner rather than later and we can bring nearer the benefits that this legislation can deliver for the vulnerable online. More than that, we can enhance the reputation of Parliament as we show that we can do difficult legislation in otherwise fractious times with sincerity, seriousness and a willingness to compromise. I think that is a valuable prize and one within our grasp, and it is why I shall support the Government amendments.
Marcus Fysh Portrait Mr Marcus Fysh (Yeovil) (Con)
- View Speech - Hansard - - - Excerpts

It is a pleasure to follow my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), who made a characteristically thoughtful speech. At the outset, I want to put on record my entry in the Register of Members’ Financial Interests, and also my chairmanships of the all-party parliamentary groups on digital identity and on central bank and digital currency, which includes stablecoins. I also put on record the fact that I am the father of an eight-year-old girl and a nine-year-old girl, who have both just got iPads, and I am very aware of the need to protect them, like all other children in the UK.

I just want to say that I have had good engagement with Ministers during the progress of this Bill through all of its stages, and I want to thank them and their teams for that. I also want to say that I really welcome what is now in the Bill to progress what I talked about in this place at the last stage it was discussed, which was the effect of algorithms and all of those design features that are part of the addiction we have heard several Members talk about as a potential harm. I think it is really good that that is in the Bill, and it is really good that the tech companies are being forced to think hard about these things.

My amendments—they are proposals for amendments rather than ones I realistically thought we would adopt through votes today—were designed to address a couple of potential shortcomings I saw in the Bill. One was the potential chilling effect on innovation and in the use of really important services that are user-to-user services from a technical point of view, but are not involved in the transmission of the content we are trying to deal with as the main objectives of this Bill. So it was very welcome to hear my hon. Friend the Minister speak at the Dispatch Box about the Government’s intention not to cover the sorts of services to do with data exchange and multi-party computation—some of the more modern methods by which the internet of things, artificial intelligence and various other types of platform run—which are not about making content available that could be a risk in the context we are talking about.

The other shortcoming I was trying to address was this idea, coming back to my right hon. and learned Friend the Member for Kenilworth and Southam, of the potential for the introduction of systemic weaknesses and vulnerabilities into the core systems that all our communications, many of our services, Government services and others rely on day by day for their secure operation. I think he made a very interesting point about the need to think through the precise legal impact that the potential uncertainty about some of those issues might have on the operation of those systems.

I am trying to introduce amendments—for example, amendment (a) in lieu of Lords amendment 189—essentially to provide clarification. This is particularly important when we are thinking about the remote access powers or the remote viewing of information powers in Lords amendment 189, which is why I have proposed an amendment in lieu. It is incredibly important that what we do in this Bill does not create the really fundamental weaknesses that could undermine the security that we and all of our systems rely on for their core operations.

I was also trying to address people’s understandable desire for their data not to be potentially accessible by an unauthorised third party. That type of systemic weakness, which could be introduced by doing the access process in the wrong way, is something we need to think carefully about, and I hope the Minister will say something about intent in respect of that at the Dispatch Box.

I do not want to take too much more time because I know that lots of other Members wish to speak, but the place where I got these ideas, particularly around systemic weakness, was the powers in Australian law that exist to provide protection against exactly that type of application of the regulations. I know officials think that Lords amendment 189 does not present such a systemic risk, because it is about viewing information remotely rather than having access to the system directly, but I think that needs more clarity. It actually states:

“view remotely—

information…in real time”

which could potentially be interpreted as requiring that type of access.

On proportionality—this is my last point—we must think about the concept of necessity within that. We must try to strike the right balance—I hope we will all try to do this—between wanting to encourage tech firms to divulge how their systems work, and to give people, including the Government, tools to say when something is not working well so that they can opt out of it, while also ensuring that the fundamental operative mechanisms that cryptography and computer systems use to communicate with each other securely are not inadvertently undermined.

Vicky Ford Portrait Vicky Ford (Chelmsford) (Con)
- View Speech - Hansard - - - Excerpts

Let me start, like others, by saying how extraordinarily pleased I am to see the Bill return to the House today. I put on record my enormous gratitude to the many people who have worked on it, especially the families of those who have lost loved ones, organisations such as the Internet Watch Foundation, of which I have been a champion for over a decade, the Mental Health Foundation, the many Ministers who have worked on this, and especially the Secretary of State, who continued to work on it through her maternity leave, and those in the other place. It was wonderful to be at the Bar of the other place, listening to Baroness Kidron and others when they spoke, and I thank her for being here today. I also particularly wish to thank Baroness Morgan and Lord Bethell.

A few months ago, at the beginning of the year, I went to one of those meetings that all MPs do, when they go and speak to politics students in their own sixth form. They normally throw loads of questions at us, but before I let them throw questions at me, I said, “Listen, I have a question I need to ask.” As Back Benchers in this place, we get asked to work on so many different issues, so I grabbed the whiteboard and scribbled down a list of many issues that I have been asked to work on, both domestically and internationally. I gave the students each three votes and asked them what they wanted my priority to be. The issue of tackling online pornography, and the impact it was having, was way up that list.

I thank the Children’s Commissioner for the work done with young people to identify and understand that risk more. Our research asked 16 to 21-year-olds when they had first seen online pornography, and 10%—one in 10—had seen online pornography by the age of nine, and 27% had seen it by the age of 11, so more than one in four. Fifty per cent.—that is half; that is every other one of those young people—had seen online pornography before they turned 13.

It is also the case that the type of pornography they have been seeing is increasingly violent in nature, and that is changing young people’s attitudes towards sex. Young people aged 16 to 21 who see such pornography are more likely than those who do not to assume that girls expect or enjoy sex involving physical aggression such as airway restriction—strangling—or slapping. Among the respondents, 47% stated that girls expect sex to involve physical aggression, and 42% said that most girls enjoy acts of sexual aggression. Some 47% of respondents aged 18 to 21 had experienced a violent sexual act. The Children’s Commissioner also asked these young people where they were watching pornography, and the greatest number of young people were watching that pornography on Twitter—now X—rather than on pornography platforms.

15:25
This is not just an issue in this country. I took a delegation with the Inter-Parliamentary Union to the UN Commission on the Status of Women, where we held a joint meeting with the Women and Equalities Committee, and it was standing room only, with women from hugely diverse parts of the world, including South Korea, India, Canada, New Zealand and many different European countries. I said that in the UK we were seeing younger and younger children viewing online pornography that is increasingly violent in content, leading to more violence in relationships and more sexual abuse. I asked them which of them were seeing that in their own country, and every single hand in that room—standing room only—went up. They had come to that room because they knew that the UK was going to legislate in this area and they wanted to see what we did.
By passing the amendments, working with the House of Lords, we will ensure that we have age assurance to stop young people being able to see pornography, and not just on pornography sites but on social media sites. We are taking massive steps to safeguard our children and young people, and the rest of the world will follow. Thank you for everything that has been done.
I also want to talk about self-harm and, in particular, eating disorders. Madam Deputy Speaker, you will remember the last time I spoke about this matter, and I speak as a former anorexic. Anorexia is the largest killer of all mental health conditions. Last week, I met mental health experts in my constituency, and they were talking about the increases we have seen recently in acute mental health issues, especially in people considering suicide and in people with eating disorders. They completely agreed, from what they are seeing on the ground, that online content encouraging or glamorising self-harm is part of what is fuelling this rise. That is why the Mental Health Foundation, Beat and other charities have worked so hard, and I thank them for their advice and work. They have long called for better regulation of dangerous suicide and eating disorder forums.
I am absolutely delighted that the Government have accepted and strengthened the amendment from Baroness Morgan of Cotes, because dangerous platforms are not just large platforms. I heard from the group I met last week about a tiny forum that is setting young women their death dates. Two young people had already killed themselves on their death date, as set by this platform, before the mental health experts had found out about it. They have been able to rescue at least two others by knowing about it. Small platforms can be really dangerous. The amendment will enable smaller platforms to be regulated in the same way as major platforms, such as Facebook.
The Mental Health Foundation said:
“We are delighted that the Government has accepted the amendment… This will make it harder for people to stumble upon the worst content and help protect their mental health. The Government is to be congratulated for this important change… We also thank all parliamentarians from both Houses and from all parties who have supported this change.”
I listened to my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), and I absolutely agree with how he set out his support for this measure. It is why I am afraid we cannot support the amendments from my hon. Friend the Member for Yeovil (Mr Fysh) in this area.
In particular, I want to make sure that the new criminal offence of intentionally encouraging people to self-harm covers eating disorders. I was grateful to the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar), for writing to me earlier this year on 17 May, confirming that the definition of serious self-harm, when it comes to this offence, will cover, for example, encouraging someone not to eat, not to drink or not to take prescribed medication. He confirmed that those provisions were included with eating disorders in mind. Actually, when we delve deeper into this, we see that it is sometimes not the individual bit of content but the way in which a vulnerable person gets bombarded with content on those platforms that can be so damaging.
Last December, the Center for Countering Digital Hate did some research into that and found that TikTok was bombarding vulnerable users with harmful content every 39 seconds. That is how the algorithm is affecting the issue. I therefore wrote back to the Minister and asked him whether the offence would cover the algorithm as well as the content. I will try to be quick, Madam Deputy Speaker, but I want to put on the record exactly what he said, because I think it is important. He said that
“the offence cannot apply to algorithms themselves. Algorithms are designed to automatically send people material that may be of interest to them. It seems unlikely that if a person merely creates an algorithm and does not themselves send, transmit or publish the communication (for example), they could be said to be undertaking a ‘relevant act’. However, every case will turn on its specific facts, and if the circumstances are such that a person’s action does constitute ‘a relevant act capable of encouraging or assisting the serious self-harm of another person’ and that act is intended to encourage or assist the serious self-harm of another person, then the creator of the algorithm will be captured.”
I wanted to read that out in this place because it is really important that creators of algorithms are aware that there is a risk that if they continue with this behaviour, which is bombarding our young people with this most dangerous content, they could be caught under that offence. Will the Minister, in his closing remarks, kindly confirm from the Dispatch Box that that is exactly what the Minister of State for Justice put in writing to me?
None Portrait Several hon. Members rose—
- Hansard -

Baroness Winterton of Doncaster Portrait Madam Deputy Speaker (Dame Rosie Winterton)
- Hansard - - - Excerpts

I have three more speakers. I ask that colleagues bear that in mind so that I can bring in the Minister.

William Cash Portrait Sir William Cash
- View Speech - Hansard - - - Excerpts

I would like to mention a very long journey in relation to the protection of children, because to my mind that is right at the heart of the Bill’s social value. I think it was Disraeli who said:

“The youth of a nation are the trustees of posterity.”

If we get it right in the early stages of their lives and we provide legislation that enables them to be properly protected, we are likely to get things right for the future. The Bill does that in a very good way.

The Bill also reflects some of the things in which I found myself involved in 1977—just over 45 years ago—with the Protection of Children Bill, when Cyril Townsend came top of the private Member’s Bill ballot. I mention that because at that time we met resistance from Government Ministers and others—I am afraid I must say that it was a Labour Minister—but we got the Bill through, because the then Prime Minister, James Callaghan, eventually ensured that it passed. His wife insisted on it, as a matter of fact.

I pay tribute to the House of Lords. Others have repeatedly mentioned the work of Baroness Kidron, but I would also like to mention Lord Bethell, Baroness Morgan and others, because it has been a combined effort. It has been Parliament at its best. I have heard others, including my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), make that point. It has been a remarkably lengthy but none the less essential process, and I pay tribute to those people for what they have done.

In retrospect, I would like to mention Baroness Lucy Faithfull, because back in 1977-78 I would not have known what to do if she had not worked relentlessly in the House of Lords to secure the measures necessary to protect children from sexual images and pornographic photography—it was about assault, and I do not need to go into the detail. The bottom line is that it was the first piece of legislation that swung the pendulum towards common sense and proportionality in matters that, 45 years later, have culminated in what has been discussed in the Bill and the amendments today.

I pay tribute to Ian Russell and to the others here whose children have been caught up in this terrible business. I pay specific tribute to the Secretary of State and the Minister, and also the Health Secretary for his statement yesterday about a national suicide strategy, in which he referenced amendments to the Bill. Because I have had a lot to do with him, I would like to pay tribute to Richard Collard of the National Society for the Prevention of Cruelty to Children, who has not been mentioned yet, for working so hard and effectively.

I pay tribute to my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) for her work to help get the amendments through. The written ministerial statement came after some interesting discussions with the Minister, who was a bit surprised by our vehemence and determination. It was not chariots of fire but chariots on fire, and within three weeks, by the time the Bill got to the House of Lords, we had a written ministerial statement that set the tone for the part of the Bill that I discussed just now, to protect children because they need protection at the right time in their lives.

The NSPCC tells us that 86% of UK adults want companies to understand how groomers and child abusers use their sites to harm children, and want action to prevent it by law. I came up with the idea, although the right hon. Member for Barking (Dame Margaret Hodge) gave us a lot of support in a debate in this House at the time, and I am grateful to her for that. The fact that we are able to come forward with this legislation owes a great deal to a lot of people from different parts of the House.

I very much accept that continuing review is necessary. Many ideas have been put forward in this debate, and I am sure that the Minister is taking them all on board and will ensure that the review happens and that Ofcom acts accordingly, which I am sure it will want to. It is important that that is done.

I must mention that the fact we have left the European Union has enabled us to produce legislation to protect children that is very significantly stronger than European Union legislation. The Digital Services Act falls very far short of what we are doing here. I pay tribute to the Government for promoting ideas based on our self-government to protect our voters’ children and our society. That step could only have been taken now that we have left the European Union.

Research by the NSPCC demonstrates that four in five victims of online grooming offences are girls. It is worth mentioning that, because it is a significant piece of research. That means that there has to be clear guidance about the types of design that will be incorporated by virtue of the discussions to be had about how to make all this legislation work properly.

The only other thing I would like to say is that the £10-million suicide prevention grant fund announced yesterday complements the Bill very well. It is important that we have a degree of symmetry between legislation to prevent suicide and to ensure that children are kept safe.

16:30
There is much more I could say but I do not need to say any more, except to say thank you to everybody in this House and in the other place, and to officials for the advice we have received from the Department and for the co-operation we have had. I believe that this will be a groundbreaking Bill when it is applied in practice. It is not enough just to pass pieces of legislation; the question is how we manage to implement them. That, to my mind, is the most important thing. I thank everybody concerned for the work they have done to make sure the Bill will eventually reach the statute book.
Miriam Cates Portrait Miriam Cates (Penistone and Stocksbridge) (Con)
- View Speech - Hansard - - - Excerpts

I will follow on from the remarks made by my right hon. Friend the Member for Chelmsford (Vicky Ford), who talked powerfully about the impact of online pornography, particularly on children who see it.

Sadly, online pornography is increasingly violent. Many videos depict graphic and degrading abuse of women and sickening acts of rape and incest, and feature many underage participants. I also want to refer to the excellent study by the Children’s Commissioner, which revealed that the average age at which children first encounter pornography online is just 13, and that there are 1.4 million visits to pornography sites by British children each and every month. As my right hon. Friend said, that is rewiring children’s brains in respect of what they think about sex, what they expect during sex and what they think girls want during sex. I think we will all look back on this widespread child exposure to pornography in a similar way to how we look back on children working down mines or being condemned to the poorhouse. Future generations will wonder how on earth we abandoned our children to online pornography.

Ending the ready availability of pornographic content to children and criminalising those who fail to protect them should surely be the most important goal of the Online Safety Bill. Indeed, that was most of the aim of part 3 of the Digital Economy Act 2017, which was never enacted. Without the Government amendments tabled in the Lords last week, which I strongly support, the Online Safety Bill would have been in danger of missing this opportunity. As my colleagues have done, I want to thank the Secretary of State and Ministers for their engagement in what has been a cross-party campaign both in this place and the other place, with Baroness Kidron and Lord Bethell leading the way, along with charities and the campaigning journalist Charles Hymas at The Daily Telegraph, who did a fantastic job of reporting it all so powerfully. I also thank my hon. Friend the Member for Stone (Sir William Cash), who has taught me all I ever needed to know about how to negotiate with Government.

We now have these brilliant strengthening amendments, including, significantly, an amendment that will criminalise directors and managers if they do not comply with Ofcom’s enforcement notices in relation to specific child safety duties. That is really important, because we are talking about the wealthiest companies in the world. Fines alone will not be enough to generate the kind of culture change at board level that we need. Only potential jail terms, which have worked in the construction industry and the financial services industry, will do what it takes.

Lords amendments 141 and 142 make pornography a primary priority harm for children. Importantly, user-to-user providers, as well as dedicated adult sites, will now be explicitly required to use highly effective age verification tools to prevent children accessing them. The wording “highly effective” is crucial, because porn is porn wherever it is found, whether on Twitter, which as my right hon. Friend the Member for Chelmsford said is the most likely place for children to find pornography, or on dedicated adult sites. It has the same effect and causes the same harm. It is therefore vital that tech companies will actually have to prevent children from going on their sites, and not just try hard. That is an incredibly important amendment.

William Cash Portrait Sir William Cash
- Hansard - - - Excerpts

Does my hon. Friend agree that what has really put their teeth on edge most of all is the idea that they might go to prison?

Miriam Cates Portrait Miriam Cates
- Hansard - - - Excerpts

My hon. Friend is completely right. The impact of not taking responsibility for protecting children has to go to the very top.

Lords amendment 105 would compel Ofcom to submit its draft codes of practice within 18 months. That is an improvement on the previously lax timescale, which I welcome—along with the other significant improvements that have been made—and I repeat my gratitude to the Minister and the Secretary of State. Let us not pretend, however, that on Royal Assent our children will suddenly be safe from online pornography or any other online harms. There are serious questions to be asked about Ofcom’s capabilities to enforce against non-compliant porn sites, and I think we should look again at part 3 of the Digital Economy Act 2017, which would have allowed the British Board of Film Classification to act as the regulator.

Ian Paisley Portrait Ian Paisley (North Antrim) (DUP)
- Hansard - - - Excerpts

I congratulate the hon. Lady on the excellent efforts she has made over a long period to highlight these matters. Does she agree that this is not the end but only the beginning of the first days of ensuring that we have proper digital access protection for not only children but adults who have access to digital devices?

Miriam Cates Portrait Miriam Cates
- Hansard - - - Excerpts

I thank the hon. Gentleman for his support. What he says is entirely correct.

The key to this does, of course, lie in the implementation. One of the capabilities of the BBFC is to disrupt the business model and the payment provision of the adult online industry. I ask the Minister to consider whether he can direct Ofcom to examine the way in which the BBFC deals with offline and streamed pornography, and whether Ofcom could learn some lessons from that. There is still a disparity between the kind of pornography that is allowed offline, on DVD or streamed services, and the kind that appears online. Offline, certain acts are illegal and the BBFC will not classify the content: any act that looks non-consensual, for example, is illegal and the material cannot be distributed, whereas online it proliferates.

The Bill should have been the perfect vehicle to expand those rules to all online services offering pornographic content. Sadly, we have missed that opportunity, but I nevertheless welcome the Government’s recently announced porn review. I hope it can be used to close the online/offline gap, to insert verification checks for people appearing in pornographic videos and to deal with related offences. Many of those people did not consent and do not know that they are in the videos.

We also need to take account of the complete lack of moderation on some of the sites. It was recently revealed in a court case in the United States that 700,000 videos on Pornhub had been flagged for illegal content, but had not been checked. Pornhub has managed to check just 50 videos a day, and has acknowledged that unless a video has been flagged more than 15 times for potential criminal content, such as child rape, it will not even join the queue to be moderated and potentially taken down. The children and the trafficked women who appear in those videos are seeing their abuse repeated millions of times with no ability to have it taken down.

The Bill has been controversial, and many of the arguments have concerned issues of free speech. I am a supporter of free speech, but violent pornography is not free speech. Drawing children into addiction is not free speech. Knowingly allowing children to view horrific sex crimes is not free speech. Publishing and profiting from videos of children being raped is not free speech. It is sickening, it is evil, it is destructive and it is a crime, and it is a crime from which too many profit with impunity. A third of the internet consists of pornography. The global porn industry’s revenue is estimated to be as much as $97 billion. The Bill is an important step forward, but we would be naive to expect this Goliath of an industry to roll over and keep children safe. There is much more to be done which will require international co-operation, co-operation from financial institutions, and Governments who are prepared to stand their ground against the might of these vested interests. I very much hope that this one will.

Anna Firth Portrait Anna Firth (Southend West) (Con)
- View Speech - Hansard - - - Excerpts

I want to speak briefly about Lords amendments 195 and 153, which would allow Ofcom, coroners and bereaved parents to acquire information and support relating to a child’s use of social media in the event of that child’s tragic death. Specifically, I want to speak about Archie Battersbee, who lived in my constituency but lost his life tragically last year, aged only 12. Archie’s mum, Hollie, was in the Public Gallery at the beginning of the debate, and I hope that she is still present. Hollie found Archie unconscious on the stairs with a ligature around his neck. The brain injury Archie suffered left him in a coma for four months and, sadly, doctors were unable to save him.

To this day, Hollie believes that Archie may have been taking part in some form of highly dangerous online challenge, but, unable to access Archie’s online data beyond 90 days of his search history, she has been unable to put this devastating question to rest. Like the parents of Molly, Breck, Isaac, Frankie and Sophia, for the last year Hollie has been engaged in a cruel uphill struggle against faceless corporations in her attempt to determine whether her child’s engagement with a digital service contributed to his death. Despite knowing that Archie viewed seven minutes of content and received online messages in the hour and a half prior to his death, she has no way of knowing what may have been said or exactly what he may have viewed, and the question of his online engagement and its potential role in his death remains unresolved.

Lords amendment 195, which will bolster Ofcom’s information-gathering powers, will I hope require a much more humane response from providers in such tragic cases as this. This is vital and much-needed legislation. Had it been in place a year ago, it is highly likely that Hollie could have laid her concerns to rest and perhaps received a pocket of peace in what has been the most traumatic time any parent could possibly imagine.

I also welcome Lords amendment 153, which will mandate the largest providers to put in place a dedicated helpline so that parents who suffer these tragic events will have a direct line and a better way of communicating with social media providers, but the proof of the pudding will obviously be in the eating. I very much hope that social media providers will man that helpline with real people who have the appropriate experience to deal with parents at that tragic time in their lives. I believe that Hollie and the parents of many other children in similar tragic cases will welcome the Government’s amendments that allow Ofcom, coroners and bereaved parents to access their children’s online data via the coroner directing Ofcom.

I pay tribute to the noble Baroness Kidron, to my right hon. Friend the Member for Bromsgrove (Sajid Javid) and to the Bereaved Families for Online Safety group, who have done so much fantastic work in sharing their heartrending stories and opening our eyes to what has been necessary to improve the Online Safety Bill. I also, of course, pay tribute to Ian Russell, to Hollie and to all the other bereaved parents for their dedication to raising awareness of this hugely important issue.

If I could just say one last thing, I have been slipped from the Education Committee to attend this debate today and I would like to give an advert for the Committee’s new inquiry, which was launched on Monday, into the effects of screen time on education and wellbeing. This Bill is not the end of the matter—in many ways it is just the beginning—and I urge all Members please to engage with this incredibly important inquiry by the Education Committee.

Paul Scully Portrait Paul Scully
- View Speech - Hansard - - - Excerpts

I thank all right hon. and hon. Members for their contribution to the debate today and, indeed, right through the passage of this complex Bill.

First, let me turn to the amendments tabled by my hon. Friend the Member for Yeovil (Mr Fysh). I understand that the intention of his amendments is to restrict the reach of the new online safety regulatory regime in a number of ways. I appreciate his concern to avoid placing unnecessary burdens on business, and I am sympathetic to his point that the Bill should not inhibit sectors such as the life sciences sector. I reassure him that such sectors are not the target of this regime and that the new regulatory framework is proportionate, risk-based and pro-innovation.

The framework has been designed to capture a range of services where there is a risk of significant harm to users, and the built-in exemptions and categorisations will ensure it is properly targeted. The alternative would be a narrow scope, which would be more likely to inadvertently exempt risky services or to displace harm on to services that are out of scope. The extensive discussion on this point in both Houses has made it clear that such a position is unlikely to be acceptable.

The amendments to the overarching statement that would change the services in scope would introduce unclear and subjective terms, causing issues of interpretation. The Bill is designed so that low-risk services will have to put in place only proportionate measures that reflect the risk of harm to their users and the service provider’s size and capacity, ensuring that small providers will not be overly burdened unless the level of risk requires it.

The amendment that would ensure Ofcom cannot require the use of a proactive technology that introduces weaknesses or vulnerabilities into a provider’s systems duplicates existing safeguards. It also introduces vague terms that could restrict Ofcom’s ability to require platforms to use the most effective measures to address abhorrent illegal activity.

Ofcom must act proportionately, and it must consider whether a less intrusive measure could achieve the same effect before requiring the use of proactive technology. Ofcom also has duties to protect both privacy and private property, including algorithms and code, under the Human Rights Act 1998.

16:45
Ian Paisley Portrait Ian Paisley
- Hansard - - - Excerpts

I thank the Minister for engaging with us on access to private property and for setting up, with his officials, a consultation on the right to access a person’s phone after they are deceased or incapacitated. I thank him for incorporating some of those thoughts in what he and the Government are doing today. I hope this is the start of something and that these big digital companies will no longer be able to bully people. The boot will be on the other foot, and the public will own what they have on their digital devices.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The hon. Gentleman is talking about the access of coroners, families and others to information, following the sad death of Molly Russell. Again, I pay tribute to Ian Russell and all the campaigners. I am glad that we have been able to find an answer to a very complex situation, not only because of its international nature but because of data protection, et cetera.

The measures I have outlined will ensure that risks relating to security vulnerabilities are managed. The Bill is also clear that Ofcom cannot require companies to use proactive technology on privately communicated content, in order to comply with their safety duties, which will provide further safeguards for user privacy and data security.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Will the Minister make it clear that we should expect the companies to use proactive technology, because they already use it to make money by recommending content to people, which is a principal reason for the Bill? If they use proactive technology to make money, they should also use it to keep people safe.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

My hon. Friend absolutely nails it. He said earlier that businesses are already collecting this data. Since I was first involved with the Bill, it has primarily been about getting businesses to adhere to their own terms and conditions. The data they use should be used in that way.

The amendment to the definition of “freedom of expression” in part 12 would have no effect as these concepts are already covered by the existing definition. Changing the definition of “automated tool” would introduce untested terms and would have an unclear and confusing impact on the duties.

My hon. Friend the Member for Yeovil also asked for clarification of how Ofcom’s power to view information remotely will be used, and whether the power is sufficiently safeguarded. I assure the House that this power is subject to strict safeguards that mean it cannot be used to undermine a provider’s systems.

On Third Reading in the other place, the Government introduced amendments that defined the regulator’s power to view information remotely, whereas previously the Bill spoke of access. As such, there are no risks to system security, as the power does not enable Ofcom to access the service. Ofcom also has a duty to act proportionately and must abide by its privacy obligations under the Human Rights Act. Ofcom has a stringent restriction on disclosing businesses’ commercially sensitive and other information without consent.

My hon. Friend also asked for clarification on whether Ofcom will be able to view live user data when using this power. Generally, Ofcom would expect to require a service to use a test dataset. However, there may be circumstances where Ofcom asks a service to execute a test using data that it holds, for example, in testing how content moderation systems respond to certain types of content on a service as part of an assessment of the systems and processes. In that scenario, Ofcom may need to use a provider’s own test dataset containing content that has previously violated its own terms of service. However, that would be subject to Ofcom’s privacy obligations and data protection law.

Lords amendment 17 seeks to explicitly exempt low-risk functionality from aspects of user-to-user services’ children’s risk assessment duties. I am happy to reassure my hon. Friend that the current drafting of the Government’s amendment in lieu of Lords amendment 17 places proportionate requirements on providers. It explicitly excludes low-risk functionality from the more stringent duty to identify and assess the impact that higher-risk functionalities have on the level of risk of harm to children. Proportionality is further baked into this duty through Ofcom’s risk assessment guidance. Ofcom is bound by the principle of proportionality as part of its general duties under the Communications Act 2003, as updated by the Bill. As such, it would not be able to recommend that providers should identify and assess low-risk functionality.

The amendment to Lords amendment 217 tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) would introduce a new safeguard that requires Ofcom to consider whether technology required under a clause 122 notice would circumvent end-to-end encryption. I wish to reassure him and others who have raised the question that the amendment is unnecessary because it is duplicative of existing measures that restrict Ofcom’s use of its powers. Under the Bill’s safeguards, Ofcom cannot require platforms to weaken or remove encryption, and must already consider the risk that specified technology can result in a breach of any statutory provision or the rule of law concerning privacy. We have intentionally designed the Bill so that it is technology neutral and futureproofed, so we cannot accept amendments that risk the legislation quickly becoming out of date. That is why we focused on safeguards that uphold user rights and ensure measures that are proportionate to the specific risks, rather than focusing on specific features such as encryption. For the reasons I have set out, I cannot accept the amendment and hope it will not be pressed to a vote.

The amendment tabled by my hon. Friend the Member for Stroud (Siobhan Baillie) would create an additional reporting requirement on Ofcom to review, as part of its report on the use of age assurance, whether the visibility of a user’s verification status improves the effectiveness of age assurance, but that duplicates existing review requirements in the Bill. The Bill already provides for a review of user verification; under clause 179, the Secretary of State will be required to review the operation of the online safety regulatory framework as a whole. This review must assess how effective the regulatory framework is at minimising the risk of harm that in-scope services pose to users in the UK. That may include a review of the effectiveness of the current user verification and non-verified users duty. I thank my hon. Friend also for raising the issue of user verification and the visibility of verification status. I am pleased to confirm that Ofcom will have the power to set out guidance on user verification status being visible to all users. With regard to online fraud or other illegal activity, mandatory user verification and visibility of verification status is something Ofcom could recommend and require under the illegal content safety duties.

Let me quickly cover some of the other points raised in the debate. I thank my hon. Friend the Member for Gosport (Dame Caroline Dinenage), a former Minister, for all her work. She talked about young people, and the Bill contains many measures, for example on self-harm and suicide content, that reflect those concerns and will help to protect them.

On the comments made by the hon. Member for Aberdeen North (Kirsty Blackman) and indeed the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), whom I am glad to see back in her place, there are a number of review points. Clause 179 requires the Secretary of State to review how the Bill is working in practice, and there will be a report resulting from that, which will be laid before Parliament. We also have the annual Ofcom report that I talked about, and most statutory instruments in the Bill will be subject to the affirmative procedure. The Bill refers to a review after two to five years—Ministers can dictate when it takes place within that period—but that is based on allowing a long enough time for the Bill to bed in and be implemented. It is important that we have the ability to look at that in Parliament.

The principles of the UN convention on the rights of the child are already in the Bill. Although the Bill does not cite the report by name, the UN convention principles are all covered in the Bill.

My hon. Friend the Member for Folkestone and Hythe (Damian Collins) did an amazing job in his time in my role, and before and afterwards as Chair of the Joint Committee responsible for the pre-legislative scrutiny of the Online Safety Bill. When he talked about scrutiny, I had the advantage of seeing the wry smile of the officials in the Box behind him. That scrutiny has been going on since 2021. Sarah Connolly, one of our amazing team of officials, has been involved with the Bill since it was just a concept.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

As Carnegie UK Trust observed online, a child born on the day the Government first published their original internet safety strategy would now be in its second year of primary school.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I do not think I need to respond to that, but it goes to show, does it not?

My hon. Friend talked about post-legislative scrutiny. Now that we have the new Department for Science, Innovation and Technology, we have extra capacity within Committees to look at various aspects, and not just online safety, as important as that is. It also gives us the ability to have sub-Committees. Clearly, we want to make sure that this and all the decisions that we make are scrutinised well. We are always open to looking at what is happening. My hon. Friend talked about Ofcom being able to appoint skilled persons for research—I totally agree, and he absolutely made the right point.

My right hon. Friend the Member for Basingstoke (Dame Maria Miller) and the hon. Member for Caithness, Sutherland and Easter Ross (Jamie Stone) talked about cyber-flashing. As I have said, that has come within the scope of the Bill, but we will also be implementing a broader package of offences that will cover the taking of intimate images without consent. To answer my right hon. Friend’s point, yes, we will still look further at that matter.

The hon. Member for Leeds East (Richard Burgon) talked about Joe Nihill. Will he please send my best wishes and thanks to Catherine and Melanie for their ongoing work in this area? It is always difficult, but it is admirable that people can turn a tragedy into such a positive cause. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made two points with which I absolutely agree. They are very much covered in the Bill and in our thinking as well, so I say yes to both.

My right hon. Friend the Member for Chelmsford (Vicky Ford) and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) talked about pornography. Clearly, we must build on the Online Safety Bill. We have the pornography review as well, which explores regulation, legislation and enforcement. We very much want to make sure that this is the first stage, but we will look at pornography and the enforcement around that in a deeper way over the next 12 months.

Jeremy Wright Portrait Sir Jeremy Wright
- Hansard - - - Excerpts

It has just crossed my mind that the Minister might be saying that he agreed with everything that I said, which cannot be right. Let me be clear about the two points. One was in relation to whether, when we look at design harms, both proportionality and balancing duties are relevant—I think that he is saying yes to both. The other point that I raised with him was around encryption, and whether I put it in the right way in terms of the Government’s position on encryption. If he cannot deal with that now, and I would understand if he cannot, will he write to me and set out whether that is the correct way to see it?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I thank my right hon. Friend for that intervention. Indeed, end-to-end encrypted services are in the scope of the Bill. Companies must assess the level of risk and meet their duties no matter what their design is.

Vicky Ford Portrait Vicky Ford
- Hansard - - - Excerpts

Can the Minister confirm whether the letter I received from the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar) is accurate?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I was just coming to that. I thank my right hon. Friend for the rest of her speech. She always speaks so powerfully on eating disorders—on anorexia in particular—and I can indeed confirm the intent behind the Minister’s letter about the creation and use of algorithms.

Finally, I shall cover two more points. My hon. Friend the Member for Stone (Sir William Cash) always speaks eloquently about this. He talked about Brexit, but I will not get into the politics of that. Suffice to say, it has allowed us—as in other areas of digital and technology—to be flexible and not prescriptive, as we have seen in measures that the EU has introduced.

I also ask my hon. Friend the Member for Southend West (Anna Firth) to pass on my thanks and best wishes to Hollie, whom I met to talk about Archie Battersbee.

17:00
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

On the small high-harm platforms that are now in the scope of the Bill, will the Minister join me in thanking Hope Not Hate, the Antisemitism Policy Trust and CST, which have campaigned heavily on this point? While we have been having this debate, the CST has exposed BitChute, one of those small high-harm platforms, for geoblocking some of the hate to comply with legislation but then advertising loopholes and ways to get around that on the platform. Can the Minister confirm that the regulator will be able to take action against such proceedings?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will certainly look at that. Our intention is that, in all areas that might not fall within the user empowerment duties, especially those relating to children and their protection, we will look to make sure that the work of those organisations is reflected in what we are trying to achieve in the Bill.

We have talked about the various Ministers that have looked after the Bill during its passage, and the Secretary of State was left literally holding the baby in every sense of the word because she continued to work on it while she was on maternity leave. We can see the results of that with the engagement that we have had. I urge all Members on both sides of the House to consider carefully the amendments I have proposed today in lieu of those made in the Lords. I know every Member looks forward eagerly to a future in which parents have surety about the safety of their children online. That future is fast approaching.

I reiterate my thanks to esteemed colleagues who have engaged so passionately with the Bill. It is due to their collaborative spirit that I stand today with amendments that we believe are effective, proportionate and agreeable to all. I hope all Members will feel able to support our position.

Amendment (a) made to Lords amendment 182.

Lords amendment 182, as amended, agreed to.

Amendments (a) and (b) made to Lords amendment 349.

Lords amendment 349, as amended, agreed to.

Amendment (a) made to Lords amendment 391.

Lords amendment 391, as amended, agreed to.

Government consequential amendment (a) made.

Lords amendment 17 disagreed to.

Government amendments (a) and (b) made in lieu of Lords amendment 17.

Lords amendment 20 disagreed to.

Lords amendment 22 disagreed to.

Lords amendment 81 disagreed to.

Government amendments (a) to (c) made in lieu of Lords amendment 81.

Lords amendment 148 disagreed to.

Government amendment (a) made in lieu of Lords amendment 148.

Lords amendments 1 to 16, 18, 19, 21, 23 to 80, 82 to 147, 149 to 181, 183 to 348, 350 to 390, and 392 to 424 agreed to, with Commons financial privileges waived in respect of Lords amendments 171, 180, 181, 317, 390 and 400.

Ordered, That a Committee be appointed to draw up Reasons to be assigned to the Lords for disagreeing to their amendments 20 and 22;

That Paul Scully, Steve Double, Alexander Stafford, Paul Howell, Alex Davies-Jones, Taiwo Owatemi and Kirsty Blackman be members of the Committee;

That Paul Scully be the Chair of the Committee;

That three be the quorum of the Committee.

That the Committee do withdraw immediately.—(Mike Wood.)

Committee to withdraw immediately; reasons to be reported and communicated to the Lords.