Online Safety Bill Debate
Jeremy Wright (Conservative - Kenilworth and Southam)
Debate with the Department for Science, Innovation & Technology
Commons Chamber

Absolutely. Given the fast nature of social media and the tech world, and how quickly they adapt—often for their own benefit, sadly—I think that a committee with that focus could work.
To wrap up, I thank MPs from across the House, and you, Madam Deputy Speaker, for your grace today. I have had help from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) in particular, for which I am very grateful. In the other place, Lord Clement-Jones, Lord Stevenson, Baroness Morgan, Baroness Fall and Baroness Wyld have all been absolutely excellent in pushing through these matters. I look forward to hearing what the Minister says, and thank everybody for their time.
As others have done, I welcome the considerable progress made on the Bill in the other place, both in the detailed scrutiny that it has received from noble Lords, who have taken a consistent and expert interest in it, and in the positive and consensual tone adopted by Opposition Front Benchers and, crucially, by Ministers.
It seems that there are very few Members of this House who have not had ministerial responsibility for the Bill at some point in what has been an extraordinarily extensive relay race as it has moved through its legislative stages. The anchor leg—the hardest bit in such a Bill—has been run with dedication and skill by my right hon. Friend the Secretary of State, who deserves all the praise that she will get for holding the baton as we cross the parliamentary finish line, as I hope we are close to doing.
I have been an advocate of humility in the way in which we all approach this legislation. It is genuinely difficult and novel territory. In general, I think that my right hon. Friend the Secretary of State and her Ministers—the noble Lord Parkinson and, of course, the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Sutton and Cheam (Paul Scully)—have been willing to change their minds when it was right to do so, and the Bill is better for it. Like others who have dealt with them, I also thank the officials, some of whom sit in the Box, some of whom do not. They have dedicated—as I suspect they would see it—most of their lives to the generation of the Bill, and we are grateful to them for their commitment.
Of course, as others have said, none of this means that the Bill is perfect; frankly, it was never going to be. Nor does it mean that when we pass the Bill, the job is done. We will then pass the baton to Ofcom, which will have a large amount of further work to do. However, we now need to finalise the legislative phase of this work after many years of consideration. For that reason, I welcome in particular what I think are sensible compromises on two significant issues that had yet to be resolved: first, the content of children’s risk assessments, and secondly, the categorisation process. I hope that the House will bear with me while I consider those in detail, which we have not yet done, starting with Lords amendments 17, 20 and 22, and Lords amendment 81 in relation to search, as well as the Government amendments in lieu of them.
Those Lords amendments insert harmful “features, functionalities or behaviours” into the list of matters that should be considered in the children’s risk assessment process and in the meeting of the safety duties, to add to the harms arising from the intrinsic nature of content itself—that is an important change. As others have done, I pay great tribute to the noble Baroness Kidron, who has invariably been the driving force behind so many of the positive enhancements to children’s online safety that the Bill will bring. She has promoted this enhancement, too. As she said, it is right to recognise and reflect in the legislation that a child’s online experience can be harmful not just as a result of the harm an individual piece of content can cause, but in the way that content is selected and presented to that child—in other words, the way in which the service is designed to operate. As she knows, however, I part company with the Lords amendments in the breadth of the language used, particularly the word “behaviours”.
Throughout our consideration of the Bill, I have taken the view that we should be less interested in passing legislation that sounds good and more interested in passing legislation that works. We need the regulator to be able to encourage and enforce improvements in online safety effectively. That means asking the online platforms to address the harms that it is within their power to address, and that relate clearly to the design or operation of the systems that they have put in place.
The difficulty with the wording of the Lords amendments is that they bring into the ambit of the legislation behaviours that are not necessarily enabled or created by the design or operation of the service. The language used is
“features, functionalities or behaviours (including those enabled or created by the design or operation of the service) that are harmful to children”—
in other words, not limited to those that are enabled or created by the service. It is a step too far to make platforms accountable for all behaviours that are harmful to children without the clarity of that link to what the platform has itself done. For that reason, I cannot support those Lords amendments.
However, the Government have proposed a sensible alternative approach in their amendments in lieu, particularly in relation to Lords amendment 17 and Lords amendment 81, which relates to search services. The Government amendments in lieu capture the central point that the design of a service can lead to harm, and require a service to assess that as part of the children’s risk assessment process. That is a significant expansion of a service’s responsibilities in the risk assessment process, which reflects not just ongoing concern about types of harm that were not adequately captured in the Bill so far, but the positive moves we have all sought to make towards safety by design as an important preventive concept in online safety.
I also think it is important, given the potential scale of this expanded responsibility, to make clear that the concept of proportionality applies to a service’s approach to this element of assessment and mitigation of risk, as it does throughout the Bill, and I hope the Minister will be able to do that when he winds up the debate.
My right hon. and learned Friend has mentioned Ofcom several times. I would like to ask his opinion as to whether there should be, if there is not already, a special provision for a report by Ofcom on its own involvement in these processes in its annual report each year, to be sure that we know that Ofcom is doing its job. In Parliament, we know what Select Committees are doing. The question is, what is Ofcom doing on a continuous basis?
My hon. Friend makes a fair point. One difficult part of our legislative journey with the Bill is to get right, in so far as we can, the balance between what the regulator should take responsibility for, what Ministers should take responsibility for and what the legislature—this Parliament—should take responsibility for. We may not have got that exactly right yet.
On my hon. Friend’s specific point, my understanding is that because Ofcom must report to Parliament in any event, it will certainly be Ofcom’s intention to report back on this. It will be quite a large slice of what Ofcom does from this point onwards, so it would be remarkable if it did not, but I think we will have to return to the points that my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and others have made about the nature of parliamentary scrutiny that is then required to ensure that we are all on top of this progress as it develops.
I was talking about what I would like my hon. Friend the Minister to say when he winds up the debate. I know he will not have a huge amount of time to do so, but he might also confirm that the balancing duties in relation to freedom of speech and privacy, for example, continue to apply to the fulfilment of the safety duties in this context as well. That would be helpful.
The Government amendments in lieu do not replicate the reference to design in the safety duties themselves, but I do not see that as problematic because, as I understand it, the risks identified in the risk assessment process, which will now include design risks, feed through to and give rise to the safety duties, so that if a design risk is identified in the risk assessment, a service is required to mitigate and address it. Again, I would be grateful if the Minister confirmed that.
We should also recognise that Government amendment (b) in lieu of Lords amendment 17 and Government amendments (b) and (c) in lieu of Lords amendment 81 specifically require consideration of
“functionalities or other features of the service that affect how much children use the service”.
As far as I can tell, that introduces consideration of design-related addiction—recognisable to many parents; it cannot just be me—into the assessment process. These changes reflect the reality of how online harm to children manifests itself, and the Government are to be congratulated on including them, although, as I say, the Government and, subsequently, Ofcom will need to be clear about what these new expectations mean in practical terms for a platform considering its risk assessment process and seeking to comply with its safety duties.
I now turn to the amendments dealing with the categorisation process, which are Lords amendment 391 and the Government amendments arising from it. Lords amendment 391 would allow Ofcom to designate a service as a category 1 service, with the additional expectations and responsibility that brings, if it is of a certain scale or if it has certain functionalities, rather than both being required as was the case in the original Bill. The effect of the original drafting was, in essence, that only big platforms could be category 1 platforms and that big platforms were bound to be category 1 platforms. That gave rise to two problems that, as my hon. Friend the Minister knows, we have discussed before.
I do not think I need to respond to that, but it goes to show, does it not?
My hon. Friend talked about post-legislative scrutiny. Now that we have the new Department for Science, Innovation and Technology, we have extra capacity within Committees to look at various aspects, not just online safety, as important as that is. It also gives us the ability to set up sub-Committees. Clearly, we want to make sure that this and all the decisions that we make are scrutinised well. We are always open to looking at what is happening. My hon. Friend talked about Ofcom being able to appoint skilled persons for research; I totally agree, and he absolutely made the right point.
My right hon. Friend the Member for Basingstoke (Dame Maria Miller) and the hon. Member for Caithness, Sutherland and Easter Ross (Jamie Stone) talked about cyber-flashing. As I have said, that has come within the scope of the Bill, but we will also be implementing a broader package of offences that will cover the taking of intimate images without consent. To answer my right hon. Friend’s point, yes, we will still look further at that matter.
The hon. Member for Leeds East (Richard Burgon) talked about Joe Nihill. Will he please send my best wishes and thanks to Catherine and Melanie for their ongoing work in this area? It is always difficult, but it is admirable that people can turn a tragedy into such a positive cause. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made two points with which I absolutely agree. They are very much covered in the Bill and in our thinking as well, so I say yes to both.
My right hon. Friend the Member for Chelmsford (Vicky Ford) and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) talked about pornography. Clearly, we must build on the Online Safety Bill. We have the pornography review as well, which explores regulation, legislation and enforcement. We very much want to make sure that this is the first stage, but we will look at pornography and the enforcement around that in a deeper way over the next 12 months.
It has just crossed my mind that the Minister might be saying that he agreed with everything that I said, which cannot be right. Let me be clear about the two points. One was in relation to whether, when we look at design harms, both proportionality and balancing duties are relevant—I think that he is saying yes to both. The other point that I raised with him was around encryption, and whether I put it in the right way in terms of the Government’s position on encryption. If he cannot deal with that now, and I would understand if he cannot, will he write to me and set out whether that is the correct way to see it?
I thank my right hon. Friend for that intervention. Indeed, end-to-end encrypted services are in the scope of the Bill. Companies must assess the level of risk and meet their duties no matter what their design is.