Online Safety Bill Debate
Commons Chamber
I have talked about the fact that we have to keep this legislation under review, because the landscape is fast-moving. At every stage that I have been dealing with this Bill, I have said that inevitably we will have to come back. We can make the Bill as flexible, proportionate and tech-unspecific as we can, but things are moving quickly. With all our work on AI, for example, such as the AI summit, the work of the Global Partnership on Artificial Intelligence, the international response, the Hiroshima accord and all the other areas that my hon. Friend the Member for Weston-super-Mare (John Penrose) spoke about earlier, we will have to come back, review it and look at whether the legislation remains world-beating. It is not just about the findings of Ofcom as it reports back to us.
I need to make a bit of progress, because I hope to have time to sum up a little bit at the end. We have listened to concerns about ensuring that the Bill provides the most robust protections for children from pornography and on the use of age assurance mechanisms. We are now explicitly requiring relevant providers to use highly effective age verification or age estimation to protect children from pornography and other primary priority content that is harmful to children. The Bill will also ensure a clear privacy-preserving and future-proofed framework governing the use of age assurance, which will be overseen by Ofcom.
There has been coverage in the media about how the Bill relates to encryption, which has often not been accurate. I take the opportunity to set the record straight. Our stance on challenging sexual abuse online remains the same. Last week in the other place, my noble Friend Lord Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, shared recent data from UK police forces that showed that 6,350 offences related to sexual communication with a child were recorded last year alone. Shockingly, 5,500 of those offences took place against primary school-age children. Those appalling statistics illustrate the urgent need for change. The Government are committed to taking action against the perpetrators and stamping out these horrific crimes. The information that social media companies currently give to UK law enforcement contributes to more than 800 arrests or voluntary attendances of suspected child sexual offenders on average every month. That results in an estimated 1,200 children being safeguarded from child sexual abuse.
There is no intention by the Government to weaken the encryption technology used by platforms. As a last resort, on a case-by-case basis, and only when stringent privacy safeguards have been met, Ofcom will have the power to direct companies to make best efforts to develop or source technology to identify and remove illegal child sexual abuse content. We know that this technology can be developed. Before it can be required by Ofcom, such technology must meet minimum standards of accuracy. If appropriate technology does not exist that meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a new solution.
Does my hon. Friend agree that the companies already say in their terms of service that they do not allow illegal use of their products, yet they do not say how they will monitor whether there is illegal use and what enforcement they take? What the Bill gives us, for the first time, is the right for Ofcom to know the answers to those questions and to know whether the companies are even enforcing their own terms of service.
My hon. Friend makes an important point, and I thank him for the amazing work he has done in getting the Bill to this point and for his ongoing help and support in making sure that we get it absolutely right. This is not about bashing technology companies; it is about not only holding them to account, but bringing them closer, to make sure that we can work together on these issues to protect the children I was talking about.
Despite the breadth of existing safeguards, we recognise the concerns expressed about privacy and technical feasibility in relation to Ofcom’s power to issue CSE or terrorism notices. That is why we introduced additional safeguards in the Lords. First, Ofcom will be required to obtain a skilled person’s report before issuing any warning notice and exercising its powers under clause 122. Ofcom must also provide a summary of the report to the relevant provider when issuing a warning notice. We are confident that in addition to Ofcom’s existing routes of evidence gathering, this measure will help to provide the regulator with the necessary information to determine whether to issue a notice and the requirements that may be put in place.
We also brought forth amendments requiring Ofcom to consider the impact that the use of technology would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. That builds on the existing safeguards in clause 133 regarding freedom of expression and privacy.
We recognise the disproportionate levels of harm that women and girls continue to face online, and that is why the Government have made a number of changes to the Bill to strengthen protections for women and girls. First, the Bill will require Ofcom to produce guidance on online harms that disproportionately affect women and girls and to provide examples of best practice to providers, and it will require providers to bring together in one clear place all the measures that they take to tackle online abuse against women and girls on their platforms. The Bill will also require Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner, in addition to the Children’s Commissioner, while preparing codes of practice. That change to the Bill will ensure that the voices of victims of abuse are brought into the consultation period.
The draft Bill was published in April 2021, so it is fantastic that we are now discussing its final stages after it has gone through its processes in the House of Lords. It went through pre-legislative scrutiny, then it was introduced here, committed to the Bill Committee, recommitted, came back to the House, went to the Lords and came back again. I do not think any Bill has had as much scrutiny and debate over such a long period of time as this one has had. Hon. Members have disagreed on it from time to time, but the spirit and motivation at every stage have never been political; they have been about trying to make the Bill the best it can possibly be. We have ended up with a process that has seen it get better through all its stages.
Picking up on the comments of the hon. Member for Aberdeen North (Kirsty Blackman) and others, the question of ongoing scrutiny of the regime is an important one. In the pre-legislative scrutiny Committee—the Joint Committee that I chaired—there was a recommendation that there should be a post-legislative scrutiny Committee or a new Joint Committee, perhaps for a limited period. The pre-legislative scrutiny Committee benefited enormously from being a Committee of both Houses. Baroness Kidron has rightly been mentioned by Members today, and she is watching us from the Gallery, keeping her scrutiny of the Bill's passage going from that position of advantage.
We have discussed a number of new technologies during the Bill’s passage that were not discussed at all on Second Reading because they were not live, including the metaverse and large language models. We are reassured that the Bill is futureproof, but we will not know until we come across such things. Ongoing scrutiny of the regime, the codes of practice and Ofcom’s risk registers is more than any one Select Committee can do. The Government have previously spoken favourably of the idea of post-legislative scrutiny, and it would be good if the Minister could say whether that is still under consideration.
My hon. Friend makes a powerful point, echoing the comments of Members on both sides of the House. He is absolutely right that, as well as the scale and character of internet harms, their dynamism is a feature that Governments must take seriously. The problem, it seems to me, is that the pace of technological change, in this area and in others, does not fit easily with the thoroughness of the democratic legislative process; we tend to want to do things at length, because we want to scrutinise them properly, and that takes time. How does my hon. Friend square that in his own mind, and what would he recommend to the Government?
The length of the process we have gone through on this Bill is a good thing, because we have ended up with probably the most comprehensive legislation in the world. We have a regulator with more power, and more power to sanction, than anywhere else. It is important to get that right.
A lot of the regulation is principle-based. It is about the regulation of user-to-user services, whereby people share things with each other through an intermediary service. Technology will develop, but those principles will underpin a lot of it. There will be specific cases where we need to think about whether the regulatory oversight works in a metaverse environment in which we are dealing with harms created by speech that has no footprint. How do we monitor and scrutinise that?
One of the hardest challenges could be making sure that companies continue to use appropriate technology to identify and mitigate harms on their platforms. The problem we have had with the regime to date is that we have relied on self-reporting from the technology companies on what is or is not possible. Indeed, the debate about end-to-end encryption is another example. The companies are saying that, if they share too much data, there is a danger that it will break encryption, but they will not say what data they gather or how they use it. For example, they will not say how they identify illegal use of their platform. Can they see the messages that people have sent after they have sent them? They will not publicly acknowledge it, and they will not say what data they gather and what triggers they could use to intervene, but the regulator will now have the right to see them. That principle of accountability and the power of the regulator to scrutinise are the two things that make me confident that this will work, but we may need to make amendments because of new things that we have not yet thought about.
In addition to the idea of annual scrutiny raised by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), does my hon. Friend think it would be a reasonably good idea for the Select Committee on Culture, Media and Sport to set up a Sub-Committee under its Standing Orders to keep an eye on this stuff? My hon. Friend was a great Chairman of that Select Committee, and such a Sub-Committee would allow the annual monitoring of all the things that could go wrong, and it could also try to keep up with the pace of change.
When I chaired the Digital, Culture, Media and Sport Committee, we set up a Sub-Committee to consider these issues and internet regulation. Of course, the Sub-Committee has the same members. It is up to the Select Committee to determine how it structures itself and spends its time, but there is only so much that any one departmental Select Committee can do among its huge range of other responsibilities. It might be worth thinking about a special Committee, drawing on the powers and knowledge of both Houses, but that is not a matter for the Bill. As my hon. Friend knows, it is a matter of amending the Standing Orders of the House, and the House must decide that it wants to create such a Committee. I think it is something we should consider.
We must make sure that encrypted services have proper transparency and accountability, and we must bring in skilled experts. Members have talked about researcher access to the companies’ data and information, and it cannot be a free-for-all; there has to be a process by which a researcher applies to get privileged access to a company’s information. Indeed, as part of responding to Ofcom’s risk registers, a company could say that allowing researchers access is one of the ways it seeks to ensure safe use of its platform, by seeking the help of others to identify harm.
There is nothing to stop Ofcom appointing many researchers. The Bill gives Ofcom the power to delegate its authority and its powers to outside expert researchers to investigate matters on its behalf. In my view, that would be a good thing for Ofcom to do, because it will not have all the expertise in-house. The power to appoint a skilled person to use the powers of Ofcom exists within the Bill, and Ofcom should say that it intends to use that power widely. I would be grateful if the Minister could confirm that Ofcom has that power in the Bill.
The hon. Gentleman is talking about the access of coroners, families and others to information, following the sad death of Molly Russell. Again, I pay tribute to Ian Russell and all the campaigners. I am glad that we have been able to find an answer to a very complex situation, not only because of its international nature but because of data protection, et cetera.
The measures I have outlined will ensure that risks relating to security vulnerabilities are managed. The Bill is also clear that Ofcom cannot require companies to use proactive technology on privately communicated content, in order to comply with their safety duties, which will provide further safeguards for user privacy and data security.
Will the Minister make it clear that we should expect the companies to use proactive technology, because they already use it to make money by recommending content to people, which is a principal reason for the Bill? If they use proactive technology to make money, they should also use it to keep people safe.
My hon. Friend absolutely nails it. He said earlier that businesses are already collecting this data. Since I was first involved with the Bill, it has primarily been about getting businesses to adhere to their own terms and conditions. The data they use should be used in that way.
The amendment to the definition of “freedom of expression” in part 12 would have no effect as these concepts are already covered by the existing definition. Changing the definition of “automated tool” would introduce untested terms and would have an unclear and confusing impact on the duties.
My hon. Friend the Member for Yeovil also asked for clarification of how Ofcom’s power to view information remotely will be used, and whether the power is sufficiently safeguarded. I assure the House that this power is subject to strict safeguards that mean it cannot be used to undermine a provider’s systems.
On Third Reading in the other place, the Government introduced amendments that defined the regulator’s power to view information remotely, whereas previously the Bill spoke of access. As such, there are no risks to system security, as the power does not enable Ofcom to access the service. Ofcom also has a duty to act proportionately and must abide by its privacy obligations under the Human Rights Act. Ofcom has a stringent restriction on disclosing businesses’ commercially sensitive and other information without consent.
My hon. Friend also asked for clarification on whether Ofcom will be able to view live user data when using this power. Generally, Ofcom would expect to require a service to use a test dataset. However, there may be circumstances where Ofcom asks a service to execute a test using data that it holds, for example, in testing how content moderation systems respond to certain types of content on a service as part of an assessment of the systems and processes. In that scenario, Ofcom may need to use a provider’s own test dataset containing content that has previously violated its own terms of service. However, that would be subject to Ofcom’s privacy obligations and data protection law.
Lords amendment 17 seeks to explicitly exempt low-risk functionality from aspects of user-to-user services’ children’s risk assessment duties. I am happy to reassure my hon. Friend that the current drafting of the Government’s amendment in lieu of Lords amendment 17 places proportionate requirements on providers. It explicitly excludes low-risk functionality from the more stringent duty to identify and assess the impact that higher-risk functionalities have on the level of risk of harm to children. Proportionality is further baked into this duty through Ofcom’s risk assessment guidance. Ofcom is bound by the principle of proportionality as part of its general duties under the Communications Act 2003, as updated by the Bill. As such, it would not be able to recommend that providers should identify and assess low-risk functionality.
The amendment to Lords amendment 217 tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) would introduce a new safeguard that requires Ofcom to consider whether technology required under a clause 122 notice would circumvent end-to-end encryption. I wish to reassure him and others who have raised the question that the amendment is unnecessary because it is duplicative of existing measures that restrict Ofcom’s use of its powers. Under the Bill’s safeguards, Ofcom cannot require platforms to weaken or remove encryption, and must already consider the risk that specified technology can result in a breach of any statutory provision or the rule of law concerning privacy. We have intentionally designed the Bill so that it is technology neutral and futureproofed, so we cannot accept amendments that risk the legislation quickly becoming out of date. That is why we focused on safeguards that uphold user rights and ensure measures that are proportionate to the specific risks, rather than focusing on specific features such as encryption. For the reasons I have set out, I cannot accept the amendment and hope it will not be pressed to a vote.
The amendment tabled by my hon. Friend the Member for Stroud (Siobhan Baillie) would create an additional reporting requirement on Ofcom to review, as part of its report on the use of age assurance, whether the visibility of a user’s verification status improves the effectiveness of age assurance, but that duplicates existing review requirements in the Bill. The Bill already provides for a review of user verification; under clause 179, the Secretary of State will be required to review the operation of the online safety regulatory framework as a whole. This review must assess how effective the regulatory framework is at minimising the risk of harm that in-scope services pose to users in the UK. That may include a review of the effectiveness of the current user verification and non-verified users duty. I thank my hon. Friend also for raising the issue of user verification and the visibility of verification status. I am pleased to confirm that Ofcom will have the power to set out guidance on user verification status being visible to all users. With regard to online fraud or other illegal activity, mandatory user verification and visibility of verification status is something Ofcom could recommend and require under legal safety duties.
Let me quickly cover some of the other points raised in the debate. I thank my hon. Friend the Member for Gosport (Dame Caroline Dinenage), a former Minister, for all her work. She talked about young people, and the Bill contains many measures, for example on self-harm and suicide content, that will still help to protect them. On the comments made by the hon. Member for Aberdeen North (Kirsty Blackman) and indeed the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), whom I am glad to see back in her place, there are a number of review points. Clause 179 requires the Secretary of State to review how the Bill is working in practice, and there will be a report resulting from that, which will be laid before Parliament. We also have the annual Ofcom report that I talked about, and most statutory instruments in the Bill will be subject to the affirmative procedure. The Bill refers to a review after two to five years—Ministers can dictate when it takes place within that period—but that is based on allowing a long enough time for the Bill to bed in and be implemented. It is important that we have the ability to look at that in Parliament. The principles of the UN convention on the rights of the child are already in the Bill; although the Bill does not cite the convention by name, its principles are all covered.
My hon. Friend the Member for Folkestone and Hythe (Damian Collins) did an amazing job in his time in my role, and before and afterwards as Chair of the Joint Committee responsible for the pre-legislative scrutiny of the Online Safety Bill. When he talked about scrutiny, I had the advantage of seeing the wry smile of the officials in the Box behind him. That scrutiny has been going on since 2021. Sarah Connolly, one of our amazing team of officials, has been involved with the Bill since it was just a concept.
As Carnegie UK Trust observed online, a child born on the day the Government first published their original internet safety strategy would now be in its second year of primary school.
I do not think I need to respond to that, but it goes to show, does it not?
My hon. Friend talked about post-legislative scrutiny. Now that we have the new Department for Science, Innovation and Technology, we have extra capacity within Committees to look at various aspects, and not just online safety, important as that is. It also gives us the ability to have Sub-Committees. Clearly, we want to make sure that this and all the decisions that we make are scrutinised well. We are always open to looking at what is happening. My hon. Friend talked about Ofcom being able to appoint skilled persons for research—I totally agree, and he absolutely made the right point.
My right hon. Friend the Member for Basingstoke (Dame Maria Miller) and the hon. Member for Caithness, Sutherland and Easter Ross (Jamie Stone) talked about cyber-flashing. As I have said, that has come within the scope of the Bill, but we will also be implementing a broader package of offences that will cover the taking of intimate images without consent. To answer my right hon. Friend’s point, yes, we will still look further at that matter.
The hon. Member for Leeds East (Richard Burgon) talked about Joe Nihill. Will he please send my best wishes and thanks to Catherine and Melanie for their ongoing work in this area? It is always difficult, but it is admirable that people can turn a tragedy into such a positive cause. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made two points with which I absolutely agree. They are very much covered in the Bill and in our thinking as well, so I say yes to both.
My right hon. Friend the Member for Chelmsford (Vicky Ford) and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) talked about pornography. Clearly, we must build on the Online Safety Bill. We have the pornography review as well, which explores regulation, legislation and enforcement. We very much want to make sure that this is the first stage, but we will look at pornography and the enforcement around that in a deeper way over the next 12 months.