(1 day, 16 hours ago)
Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
I beg to move,
That this House has considered the implementation of the Online Safety Act 2023.
It is a great pleasure to serve under your chairmanship, Mr Stringer, and I am grateful for the opportunity to open the debate. Let me start with some positives. The Online Safety Act 2023 is certainly not the last word on the subject, but it is, in my view, a big step forward in online safety, providing a variety of tools that allow the regulator to make the online world safer, particularly for children. I remain of the view that Ofcom is the right regulator for the task, not least because it can start its work sooner as an existing regulator and given the overlap with its existing work—for example, on video-sharing platforms. I also have great regard for the diligence and expertise of many at Ofcom who are now charged with these new responsibilities. However, I am concerned that Ofcom appears unwilling to use all the tools that the Act gives it to make the online world a safer place, and I am concerned that the Government appear unwilling to press Ofcom to be more ambitious. I want to explain why I am concerned, why I think it matters and what can be done about it.
Let me start with what I am worried about. There was a great deal of consensus about the passing of the Online Safety Act, and all of us involved in its development recognised both the urgent need to act on online harms and the enormity of the task. That means that the eventual version of the Act does not cover everything that is bad online and, of necessity, sets up a framework within which the regulator is required to fill in the gaps and has considerable latitude in doing so.
The architecture of that framework is important. Because we recognised that emerging harms would be more clearly and quickly seen by online services themselves than by legislators or regulators, in broad terms the Act requires online services to properly assess the risk of harms arising on their service and then to mitigate those risks. My concern is that Ofcom has taken an unnecessarily restrictive view of the harms it is asking services to assess and act on and, indeed, a view that is inconsistent with the terms of the Act. Specifically, my conversations with Ofcom suggest to me that it believes the Act only gives it power to act on harms that arise from the viewing of individual pieces of bad content. I do not agree, and let me explain why.
With limited exceptions, if an online service has not identified a risk in its risk assessment, it does not have to take action to reduce or eliminate that risk, so which risks are identified in the risk assessment really matters. That is why the Act sets out how a service should go about its risk assessment and what it should look out for. For services that may be accessed by children, the relevant risk assessment duties are set out in section 11 of the Act. Section 11(6) lists the matters that should be taken into account in a children’s risk assessment. Some of those undoubtedly refer to content, but some do not. Section 11(6)(e), for example, refers to
“the extent to which the design of the service, in particular its functionalities”
affects the risk of adults searching for and contacting children online. That is not a risk related to individual bits of content.
It is worth looking at section 11(6)(f), which, if colleagues will indulge me, I want to quote in full. It says that a risk assessment should include
“the different ways in which the service is used, including functionalities or other features of the service that affect how much children use the service (for example a feature that enables content to play automatically), and the impact of such use on the level of risk of harm that might be suffered by children”.
I think that that paragraph is talking about harms well beyond individual pieces of bad content. It is talking about damaging behaviours deliberately instigated by the design and operation of the online service, and the way its algorithms are designed to make us interact with it. That is a problem not just with excessive screen time, on which Ofcom has been conspicuously reluctant to engage, but with the issue of children being led from innocent material to darker and darker corners of the internet. We know that that is what happened to several of the young people whose suicides have been connected to their online activity. Algorithms designed to keep the user on the service for longer make that risk greater, and Ofcom seems reluctant to act on them despite the Act giving it powers to do so. We can see that from the draft code of practice on harm to children, which Ofcom published at the end of last year.
This debate is timely because the final version of the code of practice is due in the next couple of months. If Ofcom is to change course and broaden its characterisation of the risks that online services must act on—as I believe it should—now is the time. Many of the children’s welfare organisations that we all worked with so closely to deliver the Act in the first place are saying the same.
If Ofcom’s view of the harms to children on which services should act falls short of what the Act covers, why does it matter? Again, the answer lies in the architecture of the Act. The codes of practice that Ofcom drafts set out actions that services could take to meet their online safety duties. If services do the things that the codes set out, they are taken to have met the relevant safety duty and are safe from regulatory penalty. If in the code of practice Ofcom asks services to act only on content harms, it is highly likely that that is all services will do, because it is compliance with the code that provides regulatory immunity. If it is not in the code, services probably will not do it. Codes that ignore some of the Act’s provisions to improve children’s safety mean that the online services children use will ignore those provisions, too. We should all be worried about that.
That brings me to the second area where I believe that Ofcom has misinterpreted the Act. Throughout the passage of the Act, Parliament accepted that the demands that we make of online services to improve the safety of their users would have to be reasonable, not least to balance the risks of online activity with its benefits. In later iterations of the legislation, that balance is represented by the concept of proportionality in the measures that the regulator could require services to take. Again, Ofcom has been given much latitude to interpret proportionality. I am afraid that I do not believe it has done so consistently with Parliament’s intention. Ofcom’s view appears to be that for a measure to be proportionate there must be a substantial amount of evidence to demonstrate its effectiveness. That is not my reading of it.
Section 12 of the Act sets out the obligation on services to take proportionate measures to mitigate and manage risks to children. Section 13(1) offers more on what proportionate means in that context. It states:
“In determining what is proportionate for the purposes of section 12, the following factors, in particular, are relevant—
(a) all the findings of the most recent children’s risk assessment (including as to levels of risk and as to nature, and severity, of potential harm to children), and
(b) the size and capacity of the provider of a service.”
In other words, a measure that would be ruinously expensive or disruptive, especially for a smaller service, and which would deliver only a marginal safety benefit, should not be mandated, but a measure that brings a considerable safety improvement in responding to an identified risk, even if expensive, might well be justified.
Similarly, when it comes to measures recommended in a code of practice, schedule 4(2)(b) states those measures must be
“sufficiently clear, and at a sufficiently detailed level, that providers understand what those measures entail in practice”,
and schedule 4(2)(c) states that recommended measures must be “proportionate and technically feasible”, based on the size and capacity of the service. We should not ask anything of services they cannot do, and it should be clear what they have to do to comply. That is what the Act says proportionality means. I cannot find in the Act support for the idea that we have to know something will work before we try it in order for that action to be proportionate and therefore recommended in a code of practice. Why does that disagreement on interpretation matter? Because we should want online platforms and services to be innovative in how they fulfil their safety objectives, especially in the fast-moving landscape of online harms. I fear that Ofcom’s interpretation of proportionality, as requiring evidence of effectiveness, will achieve the opposite.
There will only be an evidence base on effectiveness for a measure that is already being taken somewhere, and that has been taken for long enough to generate that evidence of effectiveness. If we limit recommended actions to those that have evidence of success, we effectively set the bar for safety measures at current best practice. Given the safe harbour offered by measures recommended in codes of practice, that could mean services being deterred from innovating, because they get the protection only by doing things that are already being done.
I thank the right hon. and learned Gentleman for securing this incredibly important debate. He has described in his very good speech how inconsistency can occur across different platforms and providers. As a parent of a 14-year-old daughter who uses multiple apps and platforms, I want confidence about how they are regulated and that the security measures to keep her safe are consistent across all platforms she might access. My responsibility as a parent is to match that. The right hon. and learned Gentleman rightly highlights how Ofcom’s interpretation of the Act has led to inconsistencies and potential grey areas for bad faith actors to exploit, which will ultimately damage our children.
The hon. Gentleman makes an interesting point. We have to balance two things, though. We want consistency, as he suggests, but we also want platforms to respond to the circumstances of their own service, and to push the boundaries of what they can achieve by way of safety measures. As I said, they are in a better position to do so than legislators or regulators are to instruct them. The Act was always intended to put the onus on the platforms to take responsibility for their own safety measures. Given the variety of actors and different services in this space, we are probably not going to get a uniform approach, nor should we want one. The hon. Gentleman is right to say that the regulator needs to ensure that its expectations of everyone are high. There is a further risk: not just that we might fix the bar at the status quo but that, because of the opportunity that platforms have to innovate, some might go backwards on new safety measures that they are already implementing because those measures are not recommended or encouraged by Ofcom’s code of practice. That cannot be what we want to happen.
Those are two areas where I believe Ofcom’s interpretation of the Act is wrong and retreats in significant ways from Parliament’s intention to give the regulator power to act to enhance children’s online safety. I also believe it matters that it is wrong. The next question is what should be done about it. I accept that sometimes, as legislators, we have no choice but to pass framework legislation, with much of the detail on implementation to come later. That may be because the subject is incredibly complex, or because the subject is fast-moving. In the case of online safety, it is both.
Framework legislation raises serious questions about how Parliament ensures its intentions are followed through in all the subsequent work on implementation. What do we do if we have empowered regulators to act but their actions do not fulfil the expectations that we set out in legislation?
Does the right hon. and learned Gentleman agree that this is not only about Ofcom but regulators more widely, and their ability to be agile? Does he believe them to be more risk-averse in areas such as digital technology, relying on traditional consultation time periods, when the technology is moving way faster?
The hon. Gentleman identifies a real risk in this space: we are always playing catch-up, and so are the regulators. That is why we have tried—perhaps not entirely successfully—to design legislation that gives the regulators the capacity to move faster, but we have to ask them to do so and they have to take responsibility for that. I am raising these points because I am concerned that this particular regulator in this particular set of circumstances is not being as fleet of foot as it could be, but the hon. Gentleman is right that this is a concern across the regulatory piece. I would also say that regulators are not the only actor. We might expect the Government to pick up this issue and ensure that regulators do what Parliament expects, but in this area the signs are not encouraging.
As some Members in Westminster Hall this morning know because they were present during the debates on it, elsewhere in the Online Safety Act there is provision to bring forward secondary legislation to determine how online services are categorised, with category 1 services being subject to additional duties and expectations. That process was discussed extensively during the passage of the Act, and an amendment was made to it in the other place to ensure that smaller platforms with high incidences of harmful content could be included in category 1, along with larger platforms. That is an important change, because some of the harm that we are most concerned about may appear on smaller specialist platforms, or may go there to hide from the regulation of larger platforms. The previous Government accepted that amendment in this House, and the current Government actively supported it in opposition.
I am afraid, however, that Ofcom has now advised the Government to disregard that change, and the Government accepted that advice and brought a statutory instrument to Committee on 4 February that blatantly contravenes the will of Parliament and the content of primary legislation. It was a clear test case of the Government’s willingness to defend the ambition of the Online Safety Act, and I am afraid they showed no willingness to do so.
If we cannot rely on the Government to protect the extent of the Act—perhaps we should not, because regulatory independence from the Executive is important—who should do it? I am sure the Minister will say in due course that it falls within the remit of the Science, Innovation and Technology Committee. I mean no disrespect to that Committee, but it has a lot on its plate already and supervision of the fast-moving world of online safety regulation is a big job in itself. It is not, by the way, the only such job that needs doing. We have passed, or are in the process of passing, several other pieces of similar framework legislation in this area, including the Digital Markets, Competition and Consumers Act 2024, the Data (Use and Access) Bill and the Media Act 2024, all of which focus on regulators’ power to act and on the Secretary of State’s power to direct them. Parliament should have the means to oversee how that legislation is being implemented too.
Many of these areas overlap, of course, as regulators have recognised. They established the Digital Regulation Co-operation Forum to deal with the existing need to collaborate, which of course is only likely to grow with the pervasive development of artificial intelligence. Surely we should think about parliamentary oversight along the same lines. That is why I am not the first, nor the only, parliamentarian to be in favour of a new parliamentary Committee—preferably a Joint Committee, so that the expertise of many in the other place can be utilised—to scrutinise digital legislation. The Government have set their face against that idea so far, but I hope they will reconsider.
My final point is that there is urgency. The children’s safety codes will be finalised within weeks, and will set the tone for how ambitious and innovative—or otherwise—online services will be in keeping our children safe online. We should want the highest possible ambition, not a reinforcement of the status quo. Ofcom will say, and has said, that it can always do more in future iterations of the codes, but realistically the first version will stand for years before it is revised, and there will be many missed opportunities to make a child’s online world safer in that time. It is even less likely that new primary legislation will come along to plug any gaps anytime soon.
As the responsible Secretary of State, I signed off the online harms White Paper in 2019. Here we are in 2025, and the Online Safety Act is still not yet fully in force. We must do the most we can with the legislation we have, and I fear that we are not.
Given the efforts that were made all across the House and well beyond it to deliver the best possible set of legislative powers in this vital area, timidity and lack of ambition on the part of Ministers or regulators—leading to a pulling back from the borders of this Act—is not just a challenge to parliamentary sovereignty but, much more importantly, a dereliction of duty to the vulnerable members of our society, whose online safety is our collective responsibility. There is still time to be braver and ensure that the Online Safety Act fulfils its potential. That is what Ofcom and the Government need to do.
I remind hon. and right hon. Members to bob if they wish to speak. I intend to call the Front-Bench spokespeople at half-past 10, so I will impose a four-minute limit on speeches. That gives very little scope for interventions; it is up to hon. Members whether to take them, but if they do, I may have to reduce the time limit.
Ofcom has had to spend a long time consulting on the codes to ensure that they are as proofed against judicial review as possible. Any re-consultation or review of the codes will result in a delay, and the best way to ensure that we can protect children is to implement the Act as soon as possible. My hon. Friend referred to the fact that both Ofcom and the Secretary of State have said that this is not a done deal; it is an iterative process, so of course we expect those codes to be reviewed.
As I said, Ofcom is moving forward with implementation of the Act. In a matter of weeks we will start to see, for the first time, safety duties making a material difference to online experiences for adults and children. Platforms are already duty-bound to assess the risk of illegal content and, with a deadline of 16 March, to complete risk assessments. Once the illegal harms codes come into effect from 17 March, Ofcom will be able to enforce the illegal content safety duties. Shortly following that, in April, Ofcom will publish the child safety codes and associated guidance, starting the clock for services to assess the risk of content harmful to children on their platforms. The child safety duties should be fully in effect by the summer.
My hon. Friend the Member for Darlington also raised the issue of dynamic risk assessment. I understand that she is in conversation with Ofcom and Ministers on that. I will await the outcome of those discussions. The implementation of the Act will bring in long overdue measures, such as preventing children from accessing pornography and legal content encouraging suicide, self-harm or eating disorders.
I have heard concerns raised by hon. Members regarding Ofcom’s approach, particularly to harmful functionalities and safety by design. We understand there is still a lot of work to be done, which is why the Secretary of State’s statement of strategic priorities places great importance on safety by design. However, it is important not to lose sight of the positive steps we expect to see this year under the Act. For instance, Ofcom’s draft child codes already include specific measures to address harmful algorithms, among other safety recommendations. We expect that Ofcom will continue to build on those important measures in the codes.
Questions were asked about whether the Government have plans to water down the Act. I can categorically state that there are no plans to water down the measures. The Secretary of State has made it very clear that any social media company that wants to operate in our society will have to comply with the law of the land. Whatever changes are made in other jurisdictions, the law of the land will remain.
The Minister might be about to come to the point I want to raise with her, which is about proportionality. Will she say something about that? I am keen to understand whether the Government accept Ofcom’s understanding of the term—that proportional measures are those measures that can be evidenced as effective. I gave reasons why I am concerned about that. I want to understand whether the Government believe that that is the correct interpretation of proportionality.
I was about to come to the point that the right hon. and learned Member raised about the digital regulation Committee. I have had a brief conversation with him about that, and agree about the importance of parliamentary scrutiny of the implementation of the Online Safety Act. I welcome the expertise that Members of both Houses bring. Select Committees are a matter for the House, as he is aware.
We will continue to work with the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee to support their ongoing scrutiny, as well as other parliamentary Committees that may have an interest in the Act. The Act requires the Secretary of State to review the effectiveness of the regime, two to five years after the legislation comes into force. We will ensure that Parliament is central to that process. I encourage the right hon. and learned Member to continue to raise the matter with the right people.
Most hon. Members raised the issue of apps. Ofcom will have a duty to publish a report on the role of app stores in children’s access to harmful content on the apps of regulated services. The report is due between January ’26 and January ’27. Once it is published, the Secretary of State may, if appropriate, make regulations to bring app stores into the scope of the Act. The timing will ensure that Ofcom can prioritise the implementation of child safety duties. I will write to the right hon. and learned Member for Kenilworth and Southam on the issue of proportionality, as I want to ensure that I give him the full details about how that is being interpreted by Ofcom.
We fully share the concerns of hon. Members over small platforms that host incredibly harmful content, such as hate forums. These dark corners of the internet are often deliberately sought out by individuals who are at risk of being radicalised.
I am grateful to everyone who has spoken in the debate. We have talked about the consensus there was in the passage of the Online Safety Bill. I think it is fair to say that that consensus is broadly still present, based on what Members have said this morning, and I am grateful for it.
There is a need to get this Act implemented. I accept what the Minister says about that, and others have made the same point: we do not want to make the best the enemy of the good, and there is always a trade-off between, on the one hand, getting the particular mechanisms that we know will protect people online in place as swiftly as possible, and on the other hand, making them as extensive and effective as possible.
However, given how long it takes for Parliament to make change—I make no apologies for repeating this point—we need to make the best use of the legislation that we have. I have not made a case this morning for extending the parameters of the legislation; I have made a case for using the parameters we already have, which Parliament has already legislated into being and which we have passed over to the regulator for it to use.
I accept that regulation and legislation are not passed for effect; we pass them so that they can work. We do it not to make ourselves feel better, but to make the lives of our constituents better, so the Minister is right to say that the usability of all this should be at the heart of what we are interested in. I accept the point made by the hon. Member for Esher and Walton (Monica Harding) that Ofcom should not be predominantly focused on insulating itself from judicial review. As a former Law Officer, I think that is an impossible task anyway. This legislation and the regulation that follows it will be challenged—the online platforms have every incentive to challenge it. We cannot be so terrified of that prospect that we are unwilling to extend the parameters of the regulation as far as we believe they should go. That is why I think everybody needs to be a tad braver in all this.
Finally, I simply want to repeat the point that many of us have made, which is that we need as Parliament to have a way of keeping our eye on what is happening in this space. These debates are great, but shouting at Ofcom through the loudhailer of Westminster Hall is not as effective as a Committee set up to do this in a more structured and, frankly, a more productive and consensual way. That is the gap that exists in the landscape of parliamentary oversight, and as we develop more and more digital regulation, as we have to, and as AI advances, we will have to fill that gap. I simply say to the Government that filling it sooner rather than later would be wise.
Question put and agreed to.
Resolved,
That this House has considered the implementation of the Online Safety Act 2023.
(2 weeks, 1 day ago)
Commons Chamber

It is imperative that we reassure people up and down the country that their data will be used safely and wisely, and that they will always remain in control of how their data is used. I can give my hon. Friend those reassurances. The House will notice that this Government have acted with transparency when it comes to informing the public how data and the algorithms that process that data are being used. Just last week I released more algorithms for public scrutiny, so that they can be put into the algorithm playbook that we have released. From Department to Department, more of those algorithms will be made available as our resources allow. That is just one example of how we are using transparency to earn the public’s trust. In the year before the general election, just one Department released an algorithm for public scrutiny.
There is a great deal in this Bill that we can all support, but some difficult concepts lurk within it, as I know the Secretary of State will recognise. He is talking about data transparency. One of the issues of concern is precisely what we mean by the “scientific research” for which data may be employed, and precisely what we mean by “the public interest” that must be served by that scientific research. We will not examine this issue on Second Reading, but may I ask him to commit to a proper examination of those concepts as the Bill moves forward, so that we can all understand what we mean and the public can get the reassurance that he describes?
I am grateful to the right hon. Gentleman for his informed intervention. I can assure him that we take this issue very seriously. I can also assure him that this is one of the issues on which we will go into considerable depth in Committee, and I am sure that his Whips are hearing of his interest in getting on to that Committee. He is clearly volunteering to put in the hard yards to make sure that we get the Bill right.
None of the things that I have outlined will succeed without trust. People will not use technology unless they are confident that it is being used safely, but we often lack the rigorous evidence that we need to take decisions about the safety of our rapidly changing online world. The provisions in this Bill will allow researchers to access data held by platforms, enabling them to conduct robust independent research into online safety. I am grateful to peers for their dedication in rigorously scrutinising these measures. We have listened closely, and in response we have made some important changes to the Bill. First, we have brought forward measures to strengthen data protection for children. Information society service providers likely to be accessed by children will now have clear legal duties to consider how best to protect and support children when designing their data-processing activities.
Secondly, we have added a provision to help charities use email to engage with people who have previously supported their charitable purposes. Thirdly, we have committed to making it easier for people to navigate data protection measures in a world transformed by technology. In two rapidly growing sectors—automated decision making and edtech—we will ask the Information Commissioner’s Office to publish codes of practice to give people the knowledge and confidence they need to use personal data legally.
I am grateful for my hon. Friend’s work on the Culture, Media and Sport Committee in scrutinising these areas and for being a voice for the sector. It goes without saying that I would be delighted to meet the people he references, and the same goes for Members on both sides of the House. Whether I can fit every one of the 2.5 million people who work in the sector into my office, I do not know. It is a bigger office than I had seven months ago, but I am not sure I can fit everyone in. However, I will do my absolute best; I am here to listen and learn, as I have been from the outset, and I am here to find a way through. It is time to reconcile these issues and to give certainty to people in both the creative arts sector and the technology sector. I believe the Bill is the moment for this House to provide the certainty that both sides need as we move forward.
Fifthly and finally, let me say a word on Lord Lucas’s amendments. People will use digital identities to buy a house, to rent a car and to get a job. The intention of clause 45(6) is to force public authorities to share whether someone’s information, such as their sex, has changed when disclosing information under clause 45 as part of a digital verification check. That would mean passing on an excessive amount of personal data. Sharing such changes by default would be an unjustifiable invasion of people’s privacy, and I am unable to say that clause 45(6) is compatible with human rights law, which is why we will seek to overturn the amendment.
The Secretary of State is very generous in giving way. Before he finishes, may I ask him about the situation we are creating with this Bill and the Online Safety Act 2023 of setting a framework within which regulators need to operate and cover a good deal of ground? Does he think the advent of these pieces of legislation makes a stronger case for a new Committee of this House, and perhaps a Joint Committee, to maintain scrutiny of ongoing digital regulation? If so, will he be prepared to advance that case?
That is the right hon. and learned Gentleman’s second audition of the day. I am open-minded on these issues, and I take leadership from the Leader of the House on Committee matters.
(3 weeks, 2 days ago)
General Committees

It is a great and unexpected pleasure to serve under your chairmanship, Sir Christopher. I want to take this opportunity to say something about why I think these regulations are a mistake. I agree with a great deal of what the hon. Member for Aberdeen North (Kirsty Blackman) has just said—I will seek not to repeat it—but it is probably worth noting at the outset that, as the Minister has rightly explained, these regulations are not the only means by which we will hold online services to account under this legislation.
A category 1 designation allows Ofcom—the regulator—to impose additional constraints on a platform. I think that is an entirely fair point to make, but as the hon. Lady observed, something like 100,000 online services are likely to be in scope of this Act overall. It is worth noting that, in Ofcom’s assessment, only something like 12 to 16 services would qualify for category 1 status if, as is currently the case, size were the only criterion and we set the limit—as these regulations seek to do—at 7 million monthly users.
As the hon. Lady explained, over a considerable period of time, with a considerable amount of energy expended, Parliament decided that it was appropriate to include in the category 1 designation not just the largest services, but those services where a great deal of harm may be concentrated but the services are, in themselves, much smaller. Those services being smaller might happen organically, or it might, of course, happen because that harmful content seeks refuge from the regulation applied to the larger services by migrating to smaller ones.
There is good reason, therefore, to think that having smaller services potentially included in category 1 designation is a tool that Ofcom, and indeed the Government, will want to have available.
Those platforms, such as ones that specialise in suicide or self-harm, might well be the kind of platforms that we find ourselves increasingly concerned about and that the Government will increasingly be asked to do something about. I have to say to the Minister that it is not sensible to remove from the regulator’s hand the tools that it might want to use to do what the Government will undoubtedly ask it to do—the Government themselves will come under pressure to do something about that.
Again, as has been explained, what or who we include in that category 1 designation really matters, because of the additional powers and constraints that Ofcom will have available to it in relation to category 1 services. Those powers include the only powers available under this Act to protect adults from anything that is not illegal content—including vulnerable adults, by the way. There will come a time when the Government, I suspect, will wish they had more to deal with problems of that nature. As the hon. Member for Aberdeen North explained, the Act gives those powers, so it is bizarre in the extreme that the Government should choose voluntarily not to use them. It is bizarre, also, because the Labour party in opposition was clear in its support for the change.
The hon. Member for Newton Abbot quoted one example of something that the shadow spokesman at the time, the hon. Member for Pontypridd (Alex Davies-Jones), who now has Government responsibilities elsewhere, said during the passage of the Bill. I will quote another example to the Committee. She said:
“Categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet.”––[Official Report, Online Safety Public Bill Committee, 12 July 2022; c. 168.]
I think she was absolutely right then, and still is now. The draft regulations, I am afraid, do exactly what she said the Act should not do: they limit the criterion for the designation of category 1, and these additional powers, to size only.
We should think about the Government’s rationale for what they are doing. In December, the Secretary of State made a written statement to set out the reasoning for the measures that the Government have put before the Committee:
“In making these Regulations, I have considered factors as required by the Act. Amendments made during the passage of the Act, changed the consideration for Category 1 from the ‘level of risk of harm to adults from priority content that is harmful to adults disseminated by means of the service’ to ‘how easily, quickly and widely regulated user-generated content is disseminated by means of the service.’ This was a significant change”.—[Official Report, 16 December 2024; Vol. 759, c. 12WS.]
In other words, I think the Secretary of State was arguing that he has no option but to limit category 1 designation to a scale-only criterion, because that is how the Act has changed. That is fundamentally mistaken, if I may say so to the Minister. I do not expect her to have all this before her—I know her officials will take careful note—but the Act states at paragraph 1(5) of schedule 11:
“In making regulations under sub-paragraph (1)”—
the draft regulations we are discussing—
“the Secretary of State must take into account the likely impact of the number of users of the user-to-user part of the service, and its functionalities, on”—
and this is the part the Secretary of State drew out in his statement—
“how easily, quickly and widely regulated user-generated content is disseminated by means of the service.”
Without doubt, therefore, the Secretary of State has to take the number of users into account, but it is not the only criterion. There is a fundamental misunderstanding—at least, I hope that is what it is—in the ministerial statement, which suggests that that is the only criterion to be considered. It is not, and I think it is a mistake to ignore the others, which, again, have already been drawn out in the debate.
To be clear, these draft regulations mean that no smaller platform—under the level of 7 million monthly users—can ever be considered as a category 1 platform, unless or until the Government and Ofcom change their approach to the categorisation process. I repeat the point, and I make no apologies for doing so, that that is specifically contrary to what Parliament had intended in the passage of the Act.
The hon. Member for Aberdeen North and I are not the only ones making this observation. There are multiple organisations with whom we and then the Labour party worked closely to get this Act passed for the protection of those about whom the Labour party is charged with worrying. Those include organisations such as the Samaritans, Mind, the Centre for Countering Digital Hate, the Antisemitism Policy Trust and the Molly Rose Foundation, all of which care deeply about the effectiveness of this legislation, as I am sure we all do.
It is true, and the Minister may make this point, that Ofcom’s advice suggested the course of action the Government are now taking. However, “advice” is the key word. The Government were not obliged to take it, and in this instance I think they would have been wiser to resist it. Ofcom will not have all the tools it could have to deal with smaller services where greater harm may be concentrated, despite what the Act allows. I have to say that tying one hand behind Ofcom’s back is not sensible, even when Ofcom is itself asking us to do so. That is especially true when the Government place such heavy reliance on the Online Safety Act—as they are entitled to—to deal with the multiple online harms that arise.
I have lost count, as I suspect others in this Committee have, of the number of times that Ministers have referred to the Online Safety Act when challenged about harmful materials or behaviours online and said, “This is the answer. This Act gives us powers to act against services that do not do what they should.” They are right. It is not a perfect piece of legislation, and none of us involved in its generation would claim that it was, but it does give Government and regulators the powers to act. However, that does us no good at all if, in subsequent pieces of secondary legislation, the Government choose not to use those tools or put them beyond Ofcom’s reach. That is what the regulations do.
I have to say to the Minister that government is hard enough. She should not throw away the tools she needs to do the job that she has promised everyone that she will do. This is a mistake, and I hope that even at this late stage the Minister will find a way to avoid making it.
The legislation allows the Secretary of State to deviate from Ofcom’s advice and to publish a statement explaining why. However, the core consideration for category 1 under schedule 11 is—I repeat for the third time—how easily, quickly and widely regulated user-generated content is disseminated by means of a service. As a result, for category 1, Ofcom concluded that the content is disseminated with increased breadth as the number of users increases.
The decision to proceed with the threshold combination recommended by Ofcom, rather than discounting user-number thresholds, reflects the fact that any threshold condition created by the Government should consider the factors set out in the Act, including easy, quick and wide dissemination for category 1, and the evidence base. That is what the Act says. As a result, the Government decided not to proceed with an approach that deviated from Ofcom’s recommendation, particularly considering the risk of unintended consequences.
I am more than happy to write to the hon. Member for Aberdeen North with the full details. I understand that she feels very passionately about this point, but the Act is the Act. Although I am grateful for her contribution, I have to follow what the Act says, based on the legal advice that I get.
I am extremely grateful to the Minister for giving way, and I have sympathy with her position, especially in relation to legal advice, having both received it and given it. I suggest that the Minister is talking about two different things, and they need to be separated. The first is the question of whether legal but harmful content was removed from the Bill, which it undoubtedly was. Measures in relation to content that is neither unlawful nor harmful to children were largely removed from the Bill—the Minister is right to say that.
What we are discussing, however, are the tools available to Ofcom to deal with those platforms that it is still concerned about in relation to the remaining content within the ambit of the Bill. The worry of those of us who have spoken in the debate is that the Government are about to remove one of the tools that Ofcom would have had to deal with smaller, high-harm platforms when the harm in question remains in ambit of the Bill—not that which was taken out during its passage. Would the Minister accept that?
I will again set out what the Secretary of State’s powers are. The Government have considered the suggestion of Baroness Morgan and others to categorise small but risky services based on a coroner or Ofcom linking a service to a death. The Government were grateful for that suggestion. However, there were issues with that approach, including what the Act allows the Secretary of State to consider when setting the categories. The Secretary of State is not allowed to consider anything other than the factors set out in the Act, which says that the consideration has to include easy, quick and wide dissemination for category 1, and has to be evidence based.
I hope that the hon. Member for Aberdeen North will accept that I will write to her in great detail, and include a letter from Government lawyers setting out what I am saying in relation to the powers of the Secretary of State in setting the categories. I hope that she will be satisfied with that. I want to make it clear that we are not taking anything out; the Secretary of State is proceeding with the powers that he has been given.
Thank you, Sir Christopher—I appreciate that prod. I did look at Standing Orders this morning, but could not find that bit, so that is incredibly helpful.
On what the Minister said about schedule 11 and the notes that she has been passed from her team on that point, I appreciate her commitment to share the Government’s legal advice. That will be incredibly helpful; it would have been helpful to have it in advance of this Committee.
In schedule 11, it says:
“In making regulations under sub-paragraph (1), the Secretary of State must take into account the likely impact of the number of users of the user-to-user part of the service, and its functionalities, on how easily, quickly and widely regulated user-generated content is disseminated by means of the service.”
Perhaps I cannot read English, or perhaps the Minister, her legal advisers and the team at DSIT read it in a different way from me, but the Secretary of State having to take something into account and the Secretary of State being bound by something are two different things—they are not the same. It does not say that the Secretary of State must regulate only on the specific number of users.
In fact, schedule 11 says earlier that the Secretary of State
“must make regulations specifying conditions…for the user-to-user part of regulated user-to-user services relating to each of the following”,
which are the
“number of users…functionalities of that part of the service, and…any other characteristics of that part of the service or factors”.
The Secretary of State must therefore make regulations in relation to any other characteristics of that part of the service or factors
“relating to that part of the service that the Secretary of State considers relevant.”
He must do that, but he must only take into account the number of users. The Government, however, have decided that what must merely be taken into account is much more important than what the Act says the Secretary of State “must” do. They have decided that despite Parliament being pretty clear in the language it has used.
I am not terribly happy with the Online Safety Act. It is a lot better than the situation we have currently, but it is far from perfect. As the Minister said, I argued in favour of keeping the stuff about legal but harmful content for adults. I argued against the then Government’s position on that, but the Act is the Act that we have.
The Minister’s point does not make sense. The Secretary of State has to take into account the number of users and how quickly things are disseminated, but he must make regulations about functionalities or factors that he considers relevant. Therefore, it seems that he does not consider suicide forums and livestreaming to be relevant; if he did, he would surely be bound by the “must” and would have to make regulations about them. It is frustrating that the Act does not do what it is supposed to do and does not protect young people from livestreaming. The Minister said that it protects people from seeing that illegal content, but it does not prevent them from creating it.
The Government could make regulations so that every platform that has a livestreaming functionality, or even every platform that has child users on it—there is a lot in the Act about the proportion of children who use a service—is automatically included in category 1 because they consider them to be high risk.
It would not be right for either of us to ask the Minister to disclose legal advice—that clearly would not be appropriate—but I am grateful for the Minister’s offer to share a slightly more expansive description of why the Government have come to the conclusion that they have.
On the hon. Lady’s point about what the Act actually says, we have both quoted paragraph 1(5) of schedule 11, which deals with whether the language that has found its way into the ministerial statement is the be-all and end-all of the Minister’s conclusions. We both think it is not. If it is the case, as I think the Minister is arguing, that the ability to disseminate “easily, quickly and widely” is essentially a synonym for the scale of the service and the number of its users, what does the hon. Lady think of the amendment that Baroness Morgan made in the other place to paragraph 1(4), which says that when the regulations we are considering specify
“the way or ways in which the relevant conditions are met”,
for category 1 threshold conditions
“at least one specified condition about number of users or functionality must be met”?
The crucial word that was added is “or”. If the number of users were required to establish what the hon. Lady has described, the word “or” would be inappropriate.
I absolutely agree, and that is a helpful clarification.
If the Government have decided that it is too difficult to regulate high-risk platforms as category 1, and that they do not matter enough because they do not have enough of an impact, they should stand up and tell us that. Rather than saying that their hands have been tied by the Act—they manifestly have not—they need to take ownership of their actions. If they have decided that such platforms are not important enough or that they cannot be bothered having a fight with Ofcom about that, they should be honest and say, “This is the position we have decided to take.” Instead, they are standing up and saying, “Our hands have been tied,” but that is just not correct: their hands have not been tied by the Act.
I appreciate that the Minister will get in touch with me about the legal advice, but it will be too late. This statutory instrument will have been through the process by that time, and people will have been put at risk as a result of the Government’s failure. They have the power to take action in relation to functionalities and factors, and in relation to suicide forums, livestreaming and the creation of child sexual abuse material, and they are choosing not to.
If the Government have decided that it is too difficult to do that, that those platforms are not risky enough and that not enough people are being harmed by them, they need to hold their hands up and say, “We’ve decided that this is the position we are going to take.” They must not hide behind the legislation, which does not say what they are telling us it says. They should just be honest about the fact that they have decided that they cannot be bothered to take action. They cannot be bothered to have a fight with Ofcom because it is not important enough. Hiding behind the legislation is incredibly cowardly—it does not say that.
(3 months ago)
Westminster Hall
It is a great pleasure to serve under your chairmanship, Mr Dowd. I congratulate the hon. Member for Darlington (Lola McEvoy) not just on securing this debate but on the way in which she made her case. I want to focus on a couple of the more technical aspects of the Online Safety Act, which are important in fulfilling the objectives that we all share this afternoon, which, as she rightly said, are to make sure that the vehicle that we now have in the OSA delivers the right outcomes for the safety of children online.
I am grateful to my hon. Friend the Member for Gosport (Dame Caroline Dinenage); she is right that I had ministerial responsibility for the Act. I think, frankly, it is harder to find Conservative Ministers who did not have responsibility for it at some point or another, but what we all tried to do was make sure that the structure of the Act would support the objectives that, again, we all share.
I will mention two specific things, which I should be grateful if the Minister would consider. I do not expect her to respond to them this afternoon, but if she would consider them and write to me, I should be very grateful.
It seems to me that we need to make sure that as responsibility for implementing the Act moves from us as legislators to Ofcom as the regulator, Government and Parliament and the regulator are on the same page. There are two areas where I am concerned that that might not be the case. The first is the question whether harm to children is all about content. I do not think it is. We have heard this afternoon that many aspects of risk and harm to children online have nothing to do with the specific nature of an individual piece of content.
The Act is important, and I believe it does support Ofcom’s ability to act in relation to harms beyond specific matters of content. For the Minister’s benefit, I have in mind section 11 of the Act on risk assessment—as she will know, because she knows it off by heart. For everybody else here, section 11 deals with risk assessment, and on that a great deal hangs. If we do a risk assessment, the obligation is to do something about risks, and that hangs on what risks are identified in the assessment. So the risk assessment matters.
As I read the Act, section 11 says that, yes, we must risk-assess for individual harmful pieces of content, but under section 11(6)(f) we also must risk-assess for the different ways that the service is used, including functionalities or other features of the service that affect how much children use the service—which goes back to a point made earlier. Those are the sorts of things it is important to underline that we expect Ofcom to attend to.
I am grateful for the Government’s statement of strategic priorities, but the point made about this being a fast-moving landscape is fundamental. Again in the Act, the codes of practice are vital, because they set out the things that platforms ought to do to keep children safe. If the platforms do the things set out in the codes, they are broadly invulnerable to further regulatory intervention. We need to act urgently to ensure that the codes of practice say what we want them to say. At the moment, my concern is that Ofcom may simply reflect current good practice and not urge the platforms to advance it. Those are the two areas that I hope the Minister will think about in relation to the draft codes and the need for an ongoing relationship between us in Parliament and Government and Ofcom to ensure that the Act continues to deliver as we want it to.
I thank the hon. Member for making that point and I absolutely welcome that intervention by internet providers. As I will go on to say, internet providers do not have to wait for the Act to be enacted; they can start making such changes now. I absolutely agree with him.
Many colleagues have raised the issue of the adequacy of the Online Safety Act. It is a landmark Act, but it is also imperfect. Ofcom’s need to consult means a long lead-in time; although it is important to get these matters right, that can often feel frustrating. None the less, we are clear that the Government’s priority is Ofcom’s effective implementation of the Act, so that those who use social media, especially children, can benefit from the Act’s wider reach and protections as soon as possible. To that end, the Secretary of State for Science, Innovation and Technology became the first Secretary of State to set out a draft statement of strategic priorities to ensure that safety cannot be an afterthought but must be baked in from the start.
The hon. Member for Strangford (Jim Shannon) raised the issue of suicide and self-harm. Ofcom is in the process of bringing the Online Safety Act’s provisions into effect. Earlier this year, it conducted a consultation on the draft illegal content codes, with one of the most harmful types of content covered being content about suicide. Child safety codes of practice were also consulted on. We expect the draft illegal content codes to be in effect by spring 2025, with child safety codes following in the summer.
Under the Act, user-to-user and search services will need to assess the risk that they might facilitate illegal content and must put in place measures to manage and mitigate any such risk. In addition, in-scope services likely to be accessed by children will need to protect children from content that is legal but none the less harmful to children, including pornography, bullying and violent content. The Act is clear that user-to-user services that allow the most harmful types of content must use highly effective age-assurance technology to prevent children from accessing it.
Ofcom will be able to use robust enforcement powers against companies that fail to fulfil their duties. Ofcom’s draft codes set out what steps services can take to meet those duties. The proposals mean that user-to-user services that do not ban harmful content should introduce highly effective age checks to prevent children from accessing the entire site or app, or age-restrict those parts of the service that host harmful content. The codes also tackle algorithms that amplify harm and feed harmful material to children, which have been discussed today. Under Ofcom’s proposal, services will have to configure their algorithms to filter out the most harmful types of content from children’s feeds, and reduce the visibility and prominence of other harmful content.
The hon. Member for Aberdeen North (Kirsty Blackman), the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) and others discussed strengthening the codes. Ofcom has been very clear that it will look to strengthen the codes in future iterations. The Government will encourage it to do so as harmful online technology and the evidence base about such technology evolves.
I am short of time, so I will have to proceed.
For example, Ofcom recently announced plans to launch a further consultation on the illegal content duties once the first iteration of those duties is set out in spring next year. That iterative approach enables Ofcom to prioritise getting its initial codes in place as soon as possible while it builds on the foundations set out in that first set of codes.
My hon. Friends the Members for Slough (Mr Dhesi) and for Lowestoft (Jess Asato) and the hon. Member for Aberdeen North raised the issue of violence against women and girls. In line with our safer streets mission, platforms will have new duties to create safer spaces for women and girls. It is a priority of the Online Safety Act for platforms proactively to tackle the most harmful illegal content, which includes offences such as harassment, sexual exploitation, extreme pornography, intimate image abuse, stalking and controlling or coercive behaviour, much of which disproportionately affects women and girls. All services in scope of the Act need to understand the risks facing women and girls from illegal content online and take action to mitigate those risks.
My hon. Friend the Member for Carlisle (Ms Minns) set out powerfully the issues around child sexual exploitation and abuse. Child sexual abuse is a vile crime that inflicts long-lasting trauma on victims. UK law is crystal clear: the creation, possession and distribution of child sexual abuse images is illegal. The strongest protections in the Online Safety Act are against child sexual abuse and exploitation. Ofcom will have strong powers to direct online platforms and messaging and search services to combat that kind of abuse. It will be able to require platforms to use accredited, proactive technology to tackle CSEA and will have powers to hold senior managers criminally liable if they fail to protect children.
I am running short of time, so I shall make some final remarks. While we remain resolute in our commitment to implementing the Online Safety Act as quickly and effectively as possible, we recognise the importance of these ongoing conversations, and I am grateful to everyone who has contributed to today’s debate. I am grateful to the brave parents who continue to fight for protections for children online and shine a light on these important issues. The Opposition spokesperson, the hon. Member for Runnymede and Weybridge (Dr Spencer), asked a host of questions. I will respond to him in writing, because I do not have time to do so today, and I will place a copy in the Library.
(1 year, 3 months ago)
Commons Chamber
I am honoured to have been appointed as the Minister with responsibility for tech and the digital economy, and as one of the Ministers with responsibility for the Digital Markets, Competition and Consumers Bill. When I was appointed last Tuesday, many helpful colleagues came up to me to say, “You have been thrown in at the deep end,” but it is a blessing to have responsibility for taking this legislation through the House.
In that vein, I thank my hon. Friend the Member for Sutton and Cheam (Paul Scully) for his tireless work to get the Bill to this stage.
I am aware of the importance of this legislation and the sentiment across the House to deliver the Bill quickly. The benefits of the digital market measures in part 1 of the Bill are clear to see. They will bring about a more dynamic digital economy, which prioritises innovation, growth and the delivery of better outcomes for consumers and small businesses. The rise of digital technologies has been transformative, delivering huge value to consumers and businesses. However, a small number of firms exert immense control across strategically critical services online, because the unique characteristics of digital markets, such as network effects and data consolidation, make them prone to tipping in favour of a few firms. The new digital markets regime will remove obstacles to competition and drive growth in digital markets, by proactively driving more dynamic markets and by preventing harmful practices such as making it difficult to switch between operating systems.
I turn now to the Government amendments. When the Under-Secretary of State for Business and Trade, my hon. Friend the Member for Thirsk and Malton (Kevin Hollinrake), first stood in the House, he stated that the legislation would unleash the full opportunities of digital markets for the UK. That intention has not changed, and our amendments fully support that. The Government’s amendments to part 1 will provide greater clarity to parties interacting with the regime, enhance the accountability of the regulator and make sure that the legislation is drafted effectively and meets its aims. I will address each of those themes in order.
This new regime is novel. To maximise certainty, it is critical that its parameters—the scope of the regulator’s functions and the rights and obligations set out in the legislation—are clear. Therefore, the Government have tabled a series of amendments to further clarify how the digital markets regime will work in practice. The amendments relate to how legally binding commitments provided by firms within the scope of the regime will work in practice, the Digital Markets Unit’s ability to amend certain decision notices, and how in certain circumstances the DMU may use its investigatory and enforcement powers after a firm is no longer designated.
Two important sets of clarifying amendments are worth covering in more detail. The first relates to conduct requirements. Consumer benefit is a central focus of the digital markets regime. The DMU must consider consumer benefit when shaping the design of its interventions. To reinforce that central focus, we are clarifying how the DMU will consider consumer benefits when imposing and enforcing conduct requirements. Amendment 7 requires the DMU to explain the consumer benefits that it expects to result from a conduct requirement, ensuring transparent, well-evidenced decisions. Amendments 13 and 14 simplify the wording of the countervailing benefits exemption, while critically maintaining the same high threshold.
I draw the House’s attention to my entry in the Register of Members’ Financial Interests. Let me take the opportunity to congratulate my hon. Friend the Member for Meriden (Saqib Bhatti) on his appointment. Does he recognise that it is important to be clear—and for the CMA and the DMU to be clear—that there could be a conflict between the interests of current consumers and those of future consumers? Therefore, it is important that the interests of both are balanced in what the CMA and the DMU eventually decide to do.
My right hon. Friend makes an important point. As I make progress, I hope he will be reassured that the regime will take both those things into account.
Together, amendments 13 and 14 will make sure that consumers get the best outcomes. Amendment 14 makes an important clarification on the role of third parties in the final offer mechanism process. New clause 5 and related amendments will clarify when and how third parties may make collective submissions in relation to the final offer mechanism. That is vital, as collective bargaining can help to address power imbalances during negotiations. We expect that third parties, especially smaller organisations, may seek to work together when negotiating payment terms and conditions.
My second theme is the accountability of the regulator. The discretion afforded to the CMA and its accountability to Government and Parliament have formed a large part of the debate—quite rightly—during the passage of the Bill. I will take time to address that.
The digital markets regime is flexible in its design, with the CMA requiring a level of discretion to deliver effective outcomes. While such discretion is common for ex ante regulation, it does not negate the importance of taking steps to maximise the predictability and proportionality of the regulator’s actions. For that reason, the Government are introducing an explicit requirement for the CMA to impose conduct requirements and pro-competition interventions only where it considers that it is proportionate to do so.
That will make it clear to firms in scope of the regime that they will not be subject to undue regulatory burdens. Firms will be able to challenge disproportionate obligations, and the Competition Appeal Tribunal will, in its consideration of any appeals, apply the principle of proportionality in a reasonable way, as it always does. To complement that, and to ensure consistent senior oversight and accountability of the regime, amendments 57 to 60 require enforcement decisions, including the imposition of penalties, to be reserved to the CMA board or its committee.
My right hon. Friend is always a thoughtful contributor to debates in this House. We believe that the amendments ensure that consumer benefit is at the heart of what we are doing and that any appeals will be carried out appropriately. Adopting these amendments would bring the digital markets regime into closer alignment with existing CMA mergers and markets regimes, where penalty decisions can be appealed on the merits. As in those regimes, all other decisions are appealable on judicial review principles.
I thank my hon. Friend for giving way again. He will appreciate that we are all trying to get clarity, so we understand what the proposals really mean. In relation to the appeal standard that he describes, for cases that are not specifically related to fines, he mentioned the proportionality addition earlier in his remarks. When it comes to an appeal, are we right to understand that the question of proportionality applies when the CMA originally makes its decision to require an intervention and does not apply to the JR standard that is used to determine an appeal?
It is important to be specific about that, because there are those who would argue that proportionality should be a part of the appeal process. I think the Government amendments say that proportionality applies at an earlier stage and that when it comes to considering whether the CMA has behaved in a proportionate way in making its decisions, the assessment will be made by the Competition Appeal Tribunal on JR principles. Am I right about that?
I agree that that is exactly what we are saying. I am happy to provide further clarity in my closing remarks.
Critical to accountability is, of course, transparency. The Government are committed to transparency and are bringing forward amendments that will require the CMA to set out its reasons for imposing or varying a conduct requirement. That will improve transparency around CMA decision making and increase consistency with other powers in the Bill where similar justification is required. It also reinforces the CMA’s existing responsibility to consider likely impacts on consumers when deciding whether and how to intervene.
The third theme is to ensure the legislation is drafted effectively. Therefore, we have tabled further technical amendments to ensure that the Bill’s text meets the Government’s original intended aim. They relate to the scope of conduct requirements, specifically the application of the materiality threshold contained in clause 20(3)(c), the maximum penalty limits imposed on individuals, the mergers reporting duty and the service of notices on undertakings overseas in certain circumstances.
It is worth noting that there are a small number of cross-cutting amendments contained in parts 5 and 6 of the Bill that will also impact the digital markets regime. I want to ensure that there is plenty of time for hon. Members to debate the Bill at this important stage in its passage. I appreciate a collaborative approach from across the House. I am sure that there will be many different views on some of the amendments, but I look forward to a constructive and collaborative discussion.
The hon. Gentleman is right to make that point. That is why in other jurisdictions we have seen agreement reached between big tech and newspaper titles to ensure that there is that element of fairness. I agree with him; I want to see similar fairness and equity applied across the market. What I and others who agree with me are trying to do is to ensure that, in creating this brave new world of energetic and efficient regulation, we do not as a Parliament upset the balance by giving too much power to a particular regulator. A lot of us in this place have watched with concern the failure of other types of regulation—in our water industry or our energy industry, for example. I do not think anybody would deny that, at times, we have got regulation wrong. That is why it is important that we have this debate.
There are people outside this place who have put pressure on us by saying, “The Bill is in perfect order. There is no need for you to look at it any more; great minds have thought about it.” I say to them that it is for this place to make those decisions. I do not look kindly on comments made by the chief executive of the CMA about the merits of what this place is considering while the Bill is in Parliament. I absolutely accept the independence of the CMA and the important role that it plays, but we should not confuse independence with lack of accountability. That is a point that I will warm to in a little while, when I address the relationship between regulators—in this case, the CMA—and Parliament. At the moment, that relationship is wholly inadequate.
I was making the point that, unlike the Competition Act 1998, there is a relative lack of worked-out court interpretation of this Bill’s subject matter. That has led to distinguished commentators—no less than Sir Jonathan Jones, the former Treasury Solicitor—making the point in evidence to the Committee that, in effect, the DMU would be able to decide whom it was going to regulate, set the rules that apply and then enforce those rules. The phrase “legislator, investigator and executioner” was used. While that is colourful language—perhaps too colourful for a dry debate about competition law—it is important that we reflect on the view of that former Treasury Solicitor and be very careful that in going down this road, we are not making false comparisons.
A lot has been said about Ofcom and its decisions, and comparisons have been made, but we must not forget that those Ofcom decisions were heavily governed by EU framework directive 2002/21. Article 4 of that directive says that on ex-ante telecom appeals,
“Member States shall ensure that the merits of the case are duly taken into account and that there is an effective appeal mechanism.”
That is a bit different from the provisions in the Bill. A simple JR-type review is precisely that, and no more.
I listened with interest to the intervention made by my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), who made a really good point that needs answering. We need to understand where proportionality comes into this. If the principle of proportionality is being applied in the first instance, that is all well and good, but we need to understand how it fits with the provisions of the Bill: whether it implies that the courts will deem every decision made by the DMU to be proportionate, or whether a particular decision can be challenged on the basis that it was not made in accordance with the DMU’s own principle of acting proportionately.
It seems to me—I would be interested in my right hon. and learned Friend’s view—that on the basis of the Government’s proposed wording, it is more likely that a firm will be able to challenge whether the CMA has applied its proportionality test appropriately, but the means by which it will do so will be under JR principles on appeal, rather than on a merits basis. It is not that proportionality is not subject to challenge, but that that challenge is limited by JR principles at the appeal stage. Does my right hon. and learned Friend agree?
That is what we need to bottom out. The primary worry that a lot of us have about the JR principle is that it means that any challenge will probably be vanishingly small, which is not good for ensuring that the regulator is working in the best way. None of us wants to encourage incontinent litigation—or incontinent legislation, bearing in mind the importance that we place on it—but sometimes, challenge is essential to create greater certainty. There will be ambiguities; there will be occasions where there needs to be a test. We should not be frightened of that.
(1 year, 5 months ago)
Commons Chamber
Absolutely. Given the fast-moving nature of social media and the tech world, and how quickly they adapt—often for their own benefit, sadly—I think that a committee with that focus could work.
To wrap up, I thank MPs from across the House, and you, Madam Deputy Speaker, for your grace today. I have had help from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) in particular, for which I am very grateful. In the other place, Lord Clement-Jones, Lord Stevenson, Baroness Morgan, Baroness Fall and Baroness Wyld have all been absolutely excellent in pushing through these matters. I look forward to hearing what the Minister says, and thank everybody for their time.
As others have done, I welcome the considerable progress made on the Bill in the other place, both in the detailed scrutiny that it has received from noble Lords, who have taken a consistent and expert interest in it, and in the positive and consensual tone adopted by Opposition Front Benchers and, crucially, by Ministers.
It seems that there are very few Members of this House who have not had ministerial responsibility for the Bill at some point in what has been an extraordinarily extensive relay race as it has moved through its legislative stages. The anchor leg—the hardest bit in such a Bill—has been run with dedication and skill by my right hon. Friend the Secretary of State, who deserves all the praise that she will get for holding the baton as we cross the parliamentary finish line, as I hope we are close to doing.
I have been an advocate of humility in the way in which we all approach this legislation. It is genuinely difficult and novel territory. In general, I think that my right hon. Friend the Secretary of State and her Ministers—the noble Lord Parkinson and, of course, the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Sutton and Cheam (Paul Scully)—have been willing to change their minds when it was right to do so, and the Bill is better for it. Like others who have dealt with them, I also thank the officials, some of whom sit in the Box, some of whom do not. They have dedicated—as I suspect they would see it—most of their lives to the generation of the Bill, and we are grateful to them for their commitment.
Of course, as others have said, none of this means that the Bill is perfect; frankly, it was never going to be. Nor does it mean that when we pass the Bill, the job is done. We will then pass the baton to Ofcom, which will have a large amount of further work to do. However, we now need to finalise the legislative phase of this work after many years of consideration. For that reason, I welcome in particular what I think are sensible compromises on two significant issues that had yet to be resolved: first, the content of children’s risk assessments, and secondly, the categorisation process. I hope that the House will bear with me while I consider those in detail, which we have not yet done, starting with Lords amendments 17, 20 and 22, and Lords amendment 81 in relation to search, as well as the Government amendments in lieu of them.
Those Lords amendments insert harmful “features, functionalities or behaviours” into the list of matters that should be considered in the children’s risk assessment process and in the meeting of the safety duties, to add to the harms arising from the intrinsic nature of content itself—that is an important change. As others have done, I pay great tribute to the noble Baroness Kidron, who has invariably been the driving force behind so many of the positive enhancements to children’s online safety that the Bill will bring. She has promoted this enhancement, too. As she said, it is right to recognise and reflect in the legislation that a child’s online experience can be harmful not just as a result of the harm an individual piece of content can cause, but in the way that content is selected and presented to that child—in other words, the way in which the service is designed to operate. As she knows, however, I part company with the Lords amendments in the breadth of the language used, particularly the word “behaviours”.
Throughout our consideration of the Bill, I have taken the view that we should be less interested in passing legislation that sounds good and more interested in passing legislation that works. We need the regulator to be able to encourage and enforce improvements in online safety effectively. That means asking the online platforms to address the harms that it is within their power to address and that relate clearly to the design or operation of the systems that they have put in place.
The difficulty with the wording of the Lords amendments is that they bring into the ambit of the legislation behaviours that are not necessarily enabled or created by the design or operation of the service. The language used is
“features, functionalities or behaviours (including those enabled or created by the design or operation of the service) that are harmful to children”—
in other words, not limited to those that are enabled or created by the service. It is a step too far to make platforms accountable for all behaviours that are harmful to children without the clarity of that link to what the platform has itself done. For that reason, I cannot support those Lords amendments.
However, the Government have proposed a sensible alternative approach in their amendments in lieu, particularly in relation to Lords amendment 17 and Lords amendment 81, which relates to search services. The Government amendments in lieu capture the central point that the design of a service can lead to harm and require a service to assess that as part of the children’s risk assessment process. That is a significant expansion of a service’s responsibilities in the risk assessment process, which reflects not just ongoing concern about types of harm that were not adequately captured in the Bill so far but also the positive moves we have all sought to make towards safety by design as an important preventive concept in online safety.
I also think it is important, given the potential scale of this expanded responsibility, to make clear that the concept of proportionality applies to a service’s approach to this element of assessment and mitigation of risk, as it does throughout the Bill, and I hope the Minister will be able to do that when he winds up the debate.
My right hon. and learned Friend has mentioned Ofcom several times. I would like to ask his opinion as to whether there should be, if there is not already, a special provision for Ofcom to report on its own involvement in these processes as part of its annual report, so that we can be sure that Ofcom is doing its job. In Parliament, we know what Select Committees are doing. The question is, what is Ofcom doing on a continuous basis?
My hon. Friend makes a fair point. One difficult part of our legislative journey with the Bill is to get right, in so far as we can, the balance between what the regulator should take responsibility for, what Ministers should take responsibility for and what the legislature—this Parliament—should take responsibility for. We may not have got that exactly right yet.
On my hon. Friend’s specific point, my understanding is that because Ofcom must report to Parliament in any event, it will certainly be Ofcom’s intention to report back on this. It will be quite a large slice of what Ofcom does from this point onwards, so it would be remarkable if it did not, but I think we will have to return to the points that my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and others have made about the nature of parliamentary scrutiny that is then required to ensure that we are all on top of this progress as it develops.
I was talking about what I would like my hon. Friend the Minister to say when he winds up the debate. I know he will not have a huge amount of time to do so, but he might also confirm that the balancing duties in relation to freedom of speech and privacy, for example, continue to apply to the fulfilment of the safety duties in this context as well. That would be helpful.
The Government amendments in lieu do not replicate the reference to design in the safety duties themselves, but I do not see that as problematic because, as I understand it, the risks identified in the risk assessment process, which will now include design risks, feed through to and give rise to the safety duties, so that if a design risk is identified in the risk assessment, a service is required to mitigate and address it. Again, I would be grateful if the Minister confirmed that.
We should also recognise that Government amendment (b) in lieu of Lords amendment 17 and Government amendments (b) and (c) in lieu of Lords amendment 81 specifically require consideration of
“functionalities or other features of the service that affect how much children use the service”.
As far as I can tell, that introduces consideration of design-related addiction—recognisable to many parents; it cannot just be me—into the assessment process. These changes reflect the reality of how online harm to children manifests itself, and the Government are to be congratulated on including them, although, as I say, the Government and, subsequently, Ofcom will need to be clear about what these new expectations mean in practical terms for a platform considering its risk assessment process and seeking to comply with its safety duties.
I now turn to the amendments dealing with the categorisation process, which are Lords amendment 391 and the Government amendments arising from it. Lords amendment 391 would allow Ofcom to designate a service as a category 1 service, with the additional expectations and responsibility that brings, if it is of a certain scale or if it has certain functionalities, rather than both being required as was the case in the original Bill. The effect of the original drafting was, in essence, that only big platforms could be category 1 platforms and that big platforms were bound to be category 1 platforms. That gave rise to two problems that, as my hon. Friend the Minister knows, we have discussed before.
I do not think I need to respond to that, but it goes to show, does it not?
My hon. Friend talked about post-legislative scrutiny. Now that we have the new Department for Science, Innovation and Technology, we have extra capacity within Committees to look at various aspects, and not just online safety, important as that is. It also gives us the ability to have sub-Committees. Clearly, we want to make sure that this and all the decisions that we make are scrutinised well. We are always open to looking at what is happening. My hon. Friend talked about Ofcom being able to appoint skilled persons for research—I totally agree, and he absolutely made the right point.
My right hon. Friend the Member for Basingstoke (Dame Maria Miller) and the hon. Member for Caithness, Sutherland and Easter Ross (Jamie Stone) talked about cyber-flashing. As I have said, that has come within the scope of the Bill, but we will also be implementing a broader package of offences that will cover the taking of intimate images without consent. To answer my right hon. Friend’s point, yes, we will still look further at that matter.
The hon. Member for Leeds East (Richard Burgon) talked about Joe Nihill. Will he please send my best wishes and thanks to Catherine and Melanie for their ongoing work in this area? It is always difficult, but it is admirable that people can turn a tragedy into such a positive cause. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made two points with which I absolutely agree. They are very much covered in the Bill and in our thinking as well, so I say yes to both.
My right hon. Friend the Member for Chelmsford (Vicky Ford) and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) talked about pornography. Clearly, we must build on the Online Safety Bill. We have the pornography review as well, which explores regulation, legislation and enforcement. We very much want to make sure that this is the first stage, but we will look at pornography and the enforcement around that in a deeper way over the next 12 months.
It has just crossed my mind that the Minister might be saying that he agreed with everything that I said, which cannot be right. Let me be clear about the two points. One was in relation to whether, when we look at design harms, both proportionality and balancing duties are relevant—I think that he is saying yes to both. The other point that I raised with him was around encryption, and whether I put it in the right way in terms of the Government’s position on encryption. If he cannot deal with that now, and I would understand if he cannot, will he write to me and set out whether that is the correct way to see it?
I thank my right hon. Friend for that intervention. Indeed, end-to-end encrypted services are in the scope of the Bill. Companies must assess the level of risk and meet their duties no matter what their design is.