Online Safety Bill Debate
Lord Parkinson of Whitley Bay (Conservative - Life peer)
Department for Digital, Culture, Media & Sport
Lords Chamber

We began this group on the previous day on Report, and I concluded my remarks, so it is now for other noble Lords to contribute on the amendments that I spoke to on Thursday.
My Lords, I rise emphatically to welcome the government amendments in this group. They are a thoughtful and fulsome answer to the serious concerns expressed from the four corners of the Chamber by a great many noble Lords at Second Reading and in Committee about the treatment of age verification for pornography and online harms. For this, I express my profound thanks to my noble friend the Minister, the Secretary of State, the Bill team, the Ofcom officials and all those who have worked so hard to refine this important Bill. This is a moment when the legislative team has clearly listened and done everything it possibly can to close the gap. It is very much the House of Lords at its best.
It is worth mentioning the exceptionally broad alliance of noble Lords who have worked so hard on this issue, particularly my compadres, my noble friend Lady Harding, the noble Baroness, Lady Kidron, and the right reverend Prelate the Bishop of Oxford, who all signed many of the draft amendments. There are the Front-Benchers, including the noble Lords, Lord Stevenson, Lord Knight, Lord Clement-Jones and Lord Allan of Hallam, and the noble Baroness, Lady Merron. There are the Back-Benchers behind me, including my noble friends Lady Jenkin and Lord Farmer, the noble Lords, Lord Morrow, Lord Browne and Lord Dodds, and the noble Baroness, Lady Foster. Of those in front of me, there are the noble Baronesses, Lady Benjamin and Lady Ritchie, and there is also a number too large for me to mention, from all across the House.
I very much welcome the sense of pragmatism and proportionality at the heart of the Online Safety Bill. I welcome the central use of risk assessment as a vital tool for policy implementation and the recognition that some harms are worse than others, that some children need more protection than others, that we are legislating for future technologies that we do not know much about and that we must engage industry to achieve effective implementation. As a veteran of the Communications Act 2003, I strongly support the need for enabling legislation that has agility and a broad amount of support to stand the test of time.
My Lords, this has been a good debate, perhaps unfairly curtailed in terms of the range of voices we have heard, but I am sure the points we wanted to have on the table are there and we can use them in summarising the debate we have had so far.
I welcome the Government’s amendments in this group. They have gone a long way to resolving a number of the difficulties that were left after the Digital Economy Act. As the noble Lord, Lord Clement-Jones, has said, we now have Part 3 and Part 5 hooked together in a consistent and effective way and definitions of “age verification” and “age estimation”. The noble Lord, Lord Grade, is sadly not in his place today—I normally judge the quality of the debate by the angle at which he resides in that top corner there. He is not here to judge it, but I am sure he would be upright and very excited by what we have been hearing so far. His point about the need for companies to be clearly responsible for what they serve up through their services is really important in what we are saying here today.
However, despite the welcome links across to the ICO’s age-appropriate design code, given the concerns we have been expressing on privacy there are still a number of questions which I think the Minister will want to deal with, either today or in writing. Several noble Lords have raised the question of what “proportionate” means in this area; I have mentioned it in other speeches on other groups. We all want the overall system to be proportionate in the way in which it allocates powers, duties and responsibilities to the companies providing us with services. But the question of whether children should have access to material from which they are barred by legal constraints is an exception, and I hope that “proportionate” is not being used in any sense to evade that.
I say that particularly because the concern has been raised in other debates—and I would be grateful if the Minister could make sure when he comes to respond that this issue is addressed—that smaller companies with less robust track records in terms of their income and expenditures might be able to plead that some of the responsibilities outlined in this section of the Bill do not apply to them because otherwise it would bear on their ability to continue. That would be a complete travesty of where we are trying to get to here, which is an absolute bar on children having access to material that is illegal or in the lists now in the Bill in terms of priority content.
The second worry that people have raised is: will the system that is set up here actually work in practice, particularly if it does not apply to all companies? That relates perhaps to the other half of the coin that I have just mentioned.
The third point, raised by a number of Peers, is: where does all this sit in relation to the review of pornography which was announced recently? A number of questions have been asked about issues which the Minister may be unable to respond to, but I suspect he may also want to write to us on the wider issue of timing and the terms of reference once they are settled.
I think we need to know this as we reach the end of this Bill’s progress, because you cannot expect a system being set up with the powers that are being given to Ofcom to work happily and well if Ofcom knows it is being reviewed at the same time. I hope that some consideration will be given to how we get the system up and running, even if the timescale is now tighter than it was, when at the same time a review rightly positioned to look at the wider landscape of pornography is going to impact on its work.
I want to end on the question raised by a large number of noble Lords: how does all this work sit with privacy? Where information and data are being shared on the basis of assuring access to services, there will be a worry if privacy is not ensured. The amendments tabled by the noble Baroness, Lady Kidron, are very salient to this. I look forward to the Minister’s response to them.
My Lords, I am sorry that the noble Baroness, Lady Benjamin, was unable to be here for the start of the debate on Thursday and therefore that we have not had the benefit of hearing from her today. I am very glad that she was here to hear the richly deserved plaudits from across the House for her years of campaigning on this issue.
I am very glad to have had the opportunity to discuss matters directly with her including, when it was first announced, the review that we have launched. I am pleased that she gave it a conditional thumbs up. Many of her points have been picked up by other noble Lords today. I did not expect anything more than a conditional thumbs up from her, given her commitment to getting this absolutely right. I am glad that she is here to hear some of the answers that I am able to set out, but I know that our discussions would have continued even if she had been able to speak today and that her campaigns on this important issue will not cease; she has been tireless in them. I am very grateful to her, my noble friends Lord Bethell and Lady Harding, the noble Baroness, Lady Kidron, and many others who have been working hard on this.
Let me pick up on their questions and those of the noble Baroness, Lady Ritchie of Downpatrick, and others on the review we announced last week. It will focus on the current regulatory landscape and how to achieve better alignment of online and offline regulation of commercial pornography. It will also look at the effectiveness of the criminal law and the response of the criminal justice system relating to pornography. This would focus primarily on the approach taken by law enforcement agencies and the Crown Prosecution Service, including considering whether changes to the criminal law would address the challenges identified.
The review will be informed by significant expert input from government departments across Whitehall, the Crown Prosecution Service and law enforcement agencies, as well as through consultation with the industry and with civil society organisations and regulators including, as the noble Baroness, Lady Ritchie, rightly says, some of the many NGOs that do important work in this area. It will be a cross-government effort. It will include but not be limited to input from the Ministry of Justice, the Home Office, the Department for Science, Innovation and Technology and my own Department for Culture, Media and Sport. I assure my noble friend Lord Farmer that other government departments will of course be invited to give their thoughts. It is not an exhaustive list.
I detected the enthusiasm for further details from noble Lords across the House. I am very happy to write as soon as I have more details on the review, to keep noble Lords fully informed. I can be clear that we expect the review to be complete within 12 months. The Government are committed to undertaking it in a timely fashion so that any additional safeguards for protecting UK users of online services can be put in place as swiftly as possible.
My noble friend Lord Bethell asked about international alignment and protecting Britain for investment. We continue to lead global discussions and engagement with our international partners to develop common approaches to online safety while delivering on our ambition to make the UK the safest place in the world to be online.
The noble Baroness, Lady Kidron, asked about the new requirements. They apply only to Part 3 providers, which allow pornography or other types of primary priority content on their service. Providers that prohibit this content under their terms of service for all users will not be required to use age verification or age estimation. In practice, we expect services that prohibit this content to use other measures to meet their duties, such as effective content moderation and user reporting. This would protect children from this content instead of requiring measures that would restrict children from seeing content that is not allowed on the service in the first place.
These providers can still use age verification and age estimation to comply with the existing duty to prevent children encountering primary priority content. Ofcom can still recommend age-verification and age-estimation measures in codes of practice for these providers where proportionate. On the noble Baroness’s second amendment, relating to Schedule 4, Ofcom may refer to the age-assurance principles set out in Schedule 4 in its children’s codes of practice.
On the 18-month timetable, I can confirm that 18 months is a backstop and not a target. Our aim is to have the regime in force as quickly as possible while making sure that services understand their new duties. Ofcom has set out in its implementation road map that it intends to publish draft guidance under Part 5 this autumn and draft children’s codes next spring.
The noble Baroness, Lady Ritchie, also asked about implementation timetables. I can confirm that Part 3 and Part 5 duties will be implemented at the same time. Ofcom will publish draft guidance shortly after Royal Assent for Part 5 duties and codes for the illegal content duties in Part 3. Draft codes for Part 3 children’s duties will follow in spring next year. Some Part 3 duties relating to category 1 services will be implemented later, after the categorisation thresholds have been set in secondary legislation.
The noble Lord, Lord Allan of Hallam, asked about interoperability. We have been careful to ensure that the Bill is technology neutral and to allow for innovation across the age-assurance market. We have also included a principle on interoperability in the new list of age-assurance principles in Schedule 4 and the Part 5 guidance.
At the beginning of the debate, on the previous day on Report, I outlined the government amendments in this group. There are some others, which noble Lords have spoken to. Amendments 125 and 217, from the noble Baroness, Lady Kidron, seek to add additional principles on user privacy to the new lists of age-assurance principles for both Part 3 and 5, which are brought in by Amendments 124 and 216. There are already strong safeguards for user privacy in the Bill. Part 3 and 5 providers will need to have regard to the importance of protecting users’ privacy when putting in place measures such as age verification or estimation. Ofcom will be required to set out, in codes of practice for Part 3 providers and in guidance for Part 5 providers, how they can meet these duties relating to privacy. Furthermore, companies that use age-verification or age-estimation solutions will need to comply with the UK’s robust data protection laws or face enforcement action.
Adding the proposed new principles would, we fear, introduce confusion about the nature of the privacy duties set out in the Bill. Courts are likely to assume that the additions are intended to mean something different from the provisions already in the Bill relating to privacy. The new amendments before your Lordships imply that privacy rights are unqualified and that data can never be used for more than one purpose, which is not the case. That would introduce confusion about the nature of—
My Lords, I apologise to the Minister. Can he write giving chapter and verse for that particular passage by reference to the contents of the Bill?
I am very happy to do that. That would probably be better than me trying to do so at length from the Dispatch Box.
Government Amendment 124 also reinforces the importance of protecting children’s privacy, including data protection, by ensuring that Ofcom will need to have regard to standards set out under Section 123 of the Data Protection Act 2018 in the age-appropriate design code. I hope that explains why we cannot accept Amendments 125 or 217.
The noble Baroness, Lady Fox, has Amendment 184 in this group and was unable to speak to it, but I am very happy to respond to it and the way she set it out on the Marshalled List. It seeks to place a new duty on Ofcom to evaluate whether internet service providers, internet-connected devices or individual websites should undertake user-identification and age-assurance checks. This duty would mean that such an evaluation would be needed before Ofcom produces guidance for regulated services to meet their duties under Clauses 16 and 72.
Following this evaluation, Ofcom would need to produce guidance on age-verification and age-assurance systems, which consider cybersecurity and a range of privacy considerations, to be laid before and approved by Parliament. The obligation for Ofcom to evaluate age assurance, included in the noble Baroness’s amendment, is already dealt with by Amendment 271, which the Government have tabled to place a new duty on Ofcom to publish a report on the effectiveness of age-assurance solutions. That will specifically include consideration of cost to business, and privacy, including the processing of personal data.
I have just realised that I forgot to thank the Government for Amendment 271, which reflected something I raised in Committee. I will reflect back to the Minister that, as his response now reinforces, it goes precisely where I wanted it to. That is to make sure—I have raised this many times—that we are not implementing another cookie banner, but are implementing something and then going back to ask, “Did it work as we intended? Were the costs proportionate to what we achieved?” I want to put on the record that I appreciate Amendment 271.
I appreciate the noble Lord’s interjection and, indeed, his engagement on this issue, which has informed the amendments that we have tabled.
In relation to the amendment of the noble Baroness, Lady Fox, as I set out, there are already robust safeguards for user privacy in the Bill. I have already mentioned Amendment 124, which puts age-assurance principles in the Bill. These require Ofcom to have regard, when producing its codes of practice on the use of age assurance, to the principle of protecting the privacy of users, including data protection. We think that the noble Baroness’s amendment is also unnecessary. I hope that she and the noble Baroness, Lady Kidron, will be willing to not move their amendments and to support the government amendments in the group.
There is always a simple question. We are in a bit of a mess—again. When I said at Second Reading that I thought we should try to work together, as was picked up by the noble Baroness in her powerful speech, to get the best Bill possible out of what we had before us, I really did not know what I was saying. Emotion caught me and I ripped up a brilliant speech which will never see the light of day and decided to wing it. I ended up by saying that I thought we should do the unthinkable in this House—the unthinkable in politics, possibly—and try to work together to get the Bill to come right. As the noble Lord, Lord Clement-Jones, pointed out, I do not think I have ever seen, in my time in this House, so many government amendments setting out a huge number of what we used to call concessions. I am not going to call them concessions—they are improvements to the Bill. We should pay tribute to the Minister, who has guided his extensive team, who are listening anxiously as we speak, in the good work they have been doing for some time, getting questioned quite seriously about where it is taking us.
The noble Lord, Lord Clement-Jones, is quite right to pick up what the pre-legislative scrutiny committee said about this aspect of the work we are doing today and what is in the Bill. We have not really nailed the two big issues that social media companies raise. The first is this amplification effect, where a single tweet—or thread, let us call it now—can go spinning around the world and gather support, comment, criticism, complaint, anger and all sorts of things that we probably do not really understand in the short period of time it takes to be read and reacted to. That amplification is not something we see in the real world; we do not really understand it and I am not quite sure we have got to the bottom of where we should be going at this stage.
The second most important point—the point we are stuck on at the moment; this rock, as it were, in the ocean—is the commercial pressure which, of course, drives the way in which companies operate. They are in it for the money, not the social purpose. They did not create public spaces for people to discuss the world because they think it is a good thing. There is no public service in this—this is a commercial decision to get as much money as possible from as many people as possible and, boy, are they successful.
But commercial pressures can have harms; they create harms in ways that we have discussed, and the Bill reflects many of those. This narrow focus on the way the Bill describes content—which is meant to include many of the things we have been talking about today, the four Cs that have been brought helpfully into the debate in recent months—means it does not really deal with the commercial pressures under which people are placed because of the way in which they use social media. We do not think the Bill is as clear as it could be; nor does it achieve as much as it should in trying to deal with that issue.
That is in part to do with the structure. It is almost beyond doubt that the sensibility of what we are trying to achieve here is in the Bill, but it is there at such a level of opacity that it does not have the clarity of the messages we have heard today from those who have spoken about individuals—Molly and that sort of story—and the impact on people. Even the noble Lord, Lord Bethell, whose swimming exploits we must admire, is an unwitting victim of the drive of commercial pressures that sees him in his underwear at inappropriate moments in order that they should seek the profits from that. I think it is great, but I wonder why.
I want to set the Minister a task: to convince us, now that we are at the bar, that when he says that this matter is still in play, he realises what that must imply, and to give us a guarantee that we will be able to gain from the additional time that he seeks to get this to settle. There is a case, which I hope he will agree to, for having in the Bill an overarching statement about the need to separate out the harms that arise from content and the harms that arise from the systems we have been discussing today, where content is absent. I suggest that, in going back to Clause 1, the overarching objectives clause, it might well be worth seeing whether it could be strengthened to cover this impact, so that the first thing one reads in the Bill is a sense that we embrace, understand and will act on this question of harm arising absent content. There is a case for putting into Clauses 10, 11, 25 and 82 the wording in Amendments 35, 36, 37A and 240, in the name of the noble Baroness, Lady Kidron, and using those as a way of making sure that every aspect of the journey through which social media companies must go to fulfil the duties set out in the Bill by Ofcom reflects both the material content harms and the harms that arise from the design choices those companies have made. Clauses 208 and 209 also need to give better consideration to how one describes harms, so that they are not always apparently linked to content.
That is a very high hurdle, particularly because my favourite topic of how this House works will be engaged. We have, technically, already passed Clause 1; an amendment was debated and approved, and now appears in versions of the Bill. We are about to finish with Clauses 10 and 11 today, so we are effectively saying to the Minister that he must accept that there are deficiencies in the amendments that have already been passed or would be, if we were to pass Amendments 35, 36, 37A, 85 and 240 in the name of the noble Baroness, Lady Kidron, and others. It is not impossible, and I understand that it would be perfectly reasonable, for the Government to bring back a series of amendments on Third Reading reflecting on the way in which the previous provisions do not fulfil the aspirations expressed all around the House, and therefore there is a need to change them. Given the series of conversations throughout this debate—my phone is red hot with the exchanges taking place, and we do not have a clear signal as to where that will end up—it is entirely up to the Minister to convince the House whether these discussions are worth it.
To vote on this when we are so close seems ridiculous, because I am sure that if there is time, we can make this work. But time is not always available, and it will be up to the Minister to convince us that we should not vote and up to the noble Baroness to decide whether she wishes to test the opinion of the House. We have a three-line Whip on, and we will support her. I do not think that it is necessary to vote, however—we can make this work. I appeal to the Minister to get over the bar and tell us how we are to do it.
My Lords, I am very grateful for the discussion we have had today and the parallel discussions that have accompanied it, as well as the many conversations we have had, not just over the months we have been debating the Bill but over the past few days.
I will turn in a moment to the amendments which have been the focus of the debate, but let me first say a bit about the amendments in this group that stand in my name. As noble Lords have kindly noted, we have brought forward a number of changes, informed by the discussions we have had in Committee and directly with noble Lords who have taken an interest in the Bill for a long time.
Government Amendments 281C, 281D, 281E and 281G relate to the Bill’s interpretation of “harm”, which is set out in Clause 209. We touched on that briefly in our debate on Thursday. The amendments respond to concerns which I have discussed with many across your Lordships’ House that the Bill does not clearly acknowledge that harm and risk can be cumulative. The amendments change the Bill to make that point explicit. Government Amendment 281D makes it clear that harm may be compounded in instances where content is repeatedly encountered by an individual user. That includes, but is not limited to, instances where content is repeatedly encountered as a result of algorithms or functionalities on a service. Government Amendment 281E addresses instances in which the combination of multiple functionalities on a service cumulatively drives up the risk of harm.
Those amendments go hand in hand with other changes that the Government have made on Report to strengthen protections for children. Government Amendment 1, for instance, which we discussed at the beginning of Report, makes it clear that services must be safe by design and that providers must tackle harms which arise from the design and operation of their service. Government Amendments 171 and 172 set out on the face of the Bill the categories of “primary priority” and “priority” content which is harmful to children to allow the protections for children to be implemented as swiftly as possible following Royal Assent. As these amendments demonstrate, the Government have indeed listened to concerns which have been raised from all corners of your Lordships’ House and made significant changes to strengthen the Bill’s protections for children. I agree that it has been a model of the way in which your Lordships’ House operates, and the Bill has benefited from it.
Let me turn to the amendments in the name of the noble Baroness, Lady Kidron. I am very grateful for her many hours of discussion on these specific points, as well as her years of campaigning which led to them. We have come a long way and made a lot of progress on this issue since the discussion at the start of Committee. The nature of online risk versus harm is one which we have gone over extensively. I certainly accept the points that the noble Baroness makes; I know how heartfelt they are and how they are informed by her experience sitting in courtrooms and in coroners’ inquests and talking to people who have had to be there because of the harms they or their families have encountered online. The Government are firmly of the view that it is indisputable that a platform’s functionalities, features or wider design are often the single biggest factor in determining whether a child will suffer harm. The Bill makes it clear that functions, features and design play a key role in the risk of harm occurring to a child online; I draw noble Lords’ attention to Clause 11(5), which makes it clear that the child safety duties apply across all areas of a service, including the way it is designed, operated and used, as well as content present on the service. That makes a distinction between the design, operation and use, and the content.
In addition, the Bill’s online safety objectives include that regulated services should be designed and operated so as to protect from harm people in the United Kingdom who are users of the service, including with regard to algorithms used by the service, functionalities of the services and other features relating to the operation of the service. There is no reference to content in this section, again underlining that the Bill draws a distinction.
This ensures that the role of functionalities is properly accounted for in the obligations on providers and the regulator, but I accept that noble Lords want this to be set out more clearly. Our primary aim must be to ensure that the regulatory framework can operate as intended, so that it can protect children in the way that they deserve and which we all want to see. Therefore, we cannot accept solutions that, however well meaning, may inadvertently weaken the Bill’s framework or allow providers to exploit legal uncertainty to evade their duties. We have come back to that point repeatedly in our discussions.
My Lords, as we have heard, this is a small group of amendments concerned with preventing size and lack of capacity being used as a reasonable excuse for allowing children to be unsafe. Part of the problem is the complexity of the Bill and the way it has been put together.
For example, Clause 11, around user-to-user services, is the pertinent clause and it is headed “Safety duties protecting children”. Clause 11(2) is preceded in italics with the wording “All services” so anyone reading it would think that what follows applies to all user-to-user services regardless of size. Clause 11(3) imposes a duty on providers
“to operate a service using proportionate systems and processes”
to protect children from harm. That implies that there will be judgment around what different providers can be expected to do to protect children; for example, by not having to use a particular unaffordable technical solution on age assurance if they can show the right outcome by doing things differently. That starts to fudge things a little.
The noble Lord, Lord Bethell, who introduced this debate so well with Amendment 39, supported by my noble friend Lady Ritchie, wants to be really sure that the size of the provider can never be used to argue that preventing all children from accessing porn is disproportionate and that a few children slipping through the net might just be okay.
The clarity of Clause 11 unravels even further at the end of the clause, where in subsection (12)(b) it reads that
“the size and capacity of the provider of a service”
is relevant
“in determining what is proportionate”.
At that point the clause falls apart quite thoroughly: no one reading it can be clear about what is supposed to happen.
Amendment 43 seeks to take that paragraph out, as we have heard from the noble Lord, Lord Russell, and would do the same for search in Amendment 87. I have added my name to these amendments because I fear that the ambiguity in the wording of this clause will give small and niche platforms an easy get out from ensuring that children are safe by design.
I use the phrase “by design” deliberately. We need to make a choice with this Bill even at this late stage. Is the starting point in the Bill children’s safety by design? Or is the starting point one where we do not want to overly disrupt the way providers operate their business first—which is to an extent how the speech from the noble Lord, Lord Allan, may have been heard—and then overlay children’s safety on top of that?
Yesterday, I was reading about how children access inappropriate and pornographic content, not just on Twitter, Instagram, Snapchat, TikTok and Pinterest but on Spotify and “Grand Theft Auto”—the latter being a game with an age advisory of “over 17” but which is routinely played by teenaged children. Wherever we tolerate children being online, there are dangers which must be tackled. Listening to the noble Baroness, Lady Harding, took me to where a big chunk of my day job in education goes to—children’s safeguarding. I regularly have to take training in safeguarding because of the governance responsibilities that I have. Individual childminders looking after one or two children have an assessment and an inspection around their safeguarding. In the real world we do not tolerate a lack of safety for children in this context. We should not tolerate it in the online world either.
The speech from the noble Lord, Lord Russell, reminded me of the breadcrumbing from big platforms into niche platforms that is part of the incel insight he referenced. Content that is harmful to children can also be what some children are looking for, which keeps them engaged. Small, emergent services aggressively seeking growth could set their algorithms accordingly. They must not be allowed to believe that engaging but harmful content is acceptable until they reach the size at which they can afford the age-assurance technology we might envisage in the Bill. I hope that the Minister shares our concerns and can help us with this problem.
My Lords, short debates can be helpful and useful. I am grateful to noble Lords who have spoken on this group.
I will start with Amendment 39, tabled by my noble friend Lord Bethell. Under the new duty at Clause 11(3)(a), providers which allow pornography or other forms of primary priority content under their terms of service will need to use highly effective age verification or age estimation to prevent children encountering it where they identify such content on their service, regardless of their size or capacity. While the size and capacity of providers is included as part of a consideration of proportionality, this does not mean that smaller providers or those with less capacity can evade the strengthened new duty to protect children from online pornography. In response to the questions raised by the noble Baronesses, Lady Ritchie of Downpatrick and Lady Kidron, and others, no matter how much pornographic content is on a service, where providers do not prohibit this content they would still need to meet the strengthened duty to use age verification or age estimation.
Proportionality remains relevant for the purposes of providers in scope of the new duty at Clause 11(3)(a) only in terms of the age-verification or age-estimation measures that they choose to use. A smaller provider with less capacity may choose to go for a less costly but still highly effective measure. For instance, a smaller provider with less capacity might seek a third-party solution, whereas a larger provider with greater capacity might develop their own solution. Any measures that providers use will need to meet the new high bar of being “highly effective”. If a provider does not comply with the new duties and fails to use measures which are highly effective at correctly determining whether or not a particular user is a child, Ofcom can take tough enforcement action.
The other amendments in this group seek to remove references to the size and capacity of providers in provisions relating to proportionality. The principle of proportionate, risk-based regulation is fundamental to the Bill’s regulatory framework, and we consider that the Bill as drafted already strikes the correct balance. The Bill ultimately will regulate a large number of services, ranging from some of the biggest companies in the world to smaller, voluntary organisations, as we discussed in our earlier debate on exemptions for public interest services.
The provisions regarding size and capacity recognise that what it is proportionate to require of companies of various sizes and business models will be different. Removing this provision would risk setting a lowest common denominator standard which does not create incentives for larger technology companies to do more to protect their users than smaller organisations. For example, it would not be proportionate for a large multinational company which employs thousands of content moderators and which invests in significant safety technologies to argue that it is required to take only the same steps to comply as a smaller provider which might have only a handful of employees and a few thousand UK users.
While the size and capacity of providers is included as part of a consideration of proportionality, let me be clear that this does not mean that smaller providers or those with less capacity do not need to meet the child safety duties and other duties in the Bill, such as the illegal content safety duties. These duties set out clear requirements for providers. If providers do not meet these duties, they will face enforcement action.
I hope that is reassuring to my noble friend Lord Bethell and to the other noble Lords with amendments in this group. I urge my noble friend to withdraw his amendment.
My Lords, I thank my noble friend the Minister for that reassurance. He put the points extremely well. I very much welcome his words from the Dispatch Box, which go a long way towards clarifying and reassuring.
This was a short and perfectly formed debate. I will not go on a tour d’horizon of everyone who has spoken, but I will mention the noble Lord, Lord Allan of Hallam. He is entirely right that no one wants gratuitously to hound out of the UK businesses that contribute to the economy and to our life here. There are good regulatory principles that should be applied by all regulators. The five regulatory principles of accountability, transparency, targeting, consistency and proportionality are all in the Legislative and Regulatory Reform Act 2006. Ofcom will embrace them and abide by them. That kind of reassurance is important to businesses as they approach the new regulatory regime.
I take on board what my noble friend the Minister said in terms of the application of regulations regardless of size or capacity, and the application of these strengthened duties, such as “highly effective”, regardless of any economic or financial capacity. I feel enormously reassured by what he has said. I beg leave to withdraw my amendment.
It is always nice to be nice to the Minister.
I will reference, briefly, the introduction of the amendments in the name of the noble Baroness, Lady Fraser of Craigmaddie, which I signed. They were introduced extremely competently, as you would expect, by my noble and learned kinsman Lord Hope. It is important to get the right words in the right place in Bills such as this. He is absolutely right to point out the need to be sure that we are talking about the right thing when we say “freedom of expression”—that we do mean that and not “freedom of speech”; we should not get them mixed up—and, also, to have a consistent definition that can be referred to, because so much depends on it. Indeed, this group might have run better and more fluently if we had started with this amendment, which would have then led into the speeches from those who had the other amendments in the group.
The noble Baroness is not present today, but not for bad news: for good news. Her daughter is graduating and she wanted to be present at that; it is only right that she should do that. She will be back to pick up other aspects of the devolution issues she has been following very closely, and I will support her at that time.
The debate on freedom of expression was extremely interesting. It raised issues that, perhaps, could have featured more fully had this been timetabled differently, as both noble Lords who introduced amendments on this subject said. I will get my retaliation in first: a lot of what has been asked for will have been done. I am sure that the Minister will say that, if you look at the amendment to Clause 1, the requirement there is that freedom of expression is given priority in the overall approach to the Bill, and therefore, to a large extent, the requirement to replace that at various parts of the Bill may not be necessary. But I will leave him to expand on that; I am sure that he will.
Other than that, the tension I referred to in an earlier discussion, in relation to what we are made to believe about the internet and the social media companies, is the suggestion that we are seeing a true public square, in which expressions and opinions can be exchanged as freely and openly as they would be in a public space in the real world. But, of course, neither of those places really exists, and no one can take the analogy further than has been done already.
The change, which was picked up by the noble Baroness, Lady Stowell, in relation to losing “legal but harmful”, has precipitated an issue which will be left to social media companies to organise and police—I should have put “policing” in quotation marks. As the noble Baroness, Lady Kidron, said, the remedy for much of this will be an appeals mechanism that works both at the company level and for the issues that need rebalancing in relation to complexity or because they are not being dealt with properly. We will not know that for a couple of years, but at least that has been provided for and we can look forward to it. I look forward to the Minister’s response.
My Lords, I hope that the noble Baroness, Lady Fox, and my noble friend Lord Moylan do feel that they have been listened to. It was striking, in this debate, that they had support from all corners of your Lordships’ House. I know that, at various points in Committee, they may have felt that they were in a minority, but they have been a very useful and welcome one. This debate shows that many of the arguments that they have made throughout the passage of the Bill have resonated with noble Lords from across the House.
Although I have not signed amendments in the names of the noble Baroness and my noble friend Lord Moylan, in many cases it is not because I disagree with them but because I think that what they do is already covered in the Bill. I hope to reassure them of that in what I say now.
Amendments 77 to 81 from the noble Baroness, Lady Fox, would require services to have particular regard to freedom of expression and privacy when deciding on their terms of service. Services will already need to have particular regard to users’ rights when deciding on safety systems to fulfil their duties. These requirements will be reflected in providers’ terms of service, as a result of providers’ duties to set out their safety measures in their terms of service. The framework will also include a range of measures to allow scrutiny of the formulation, clarity and implementation of category 1 providers’ own terms of service.
However, there are some points on which we disagree. For instance, we do not think that it would be appropriate for all providers to have a general duty to have a particular regard to freedom of expression when deciding on their own terms of service about content. We believe that the Bill achieves the right balance. It requires providers to have regard to freedom of expression when carrying out their safety duties, and it enables public scrutiny of terms of service, while recognising providers’ own freedom of expression rights as private entities to set the terms of service that they want. It is of course up to adults to decide which services to use based on the way those services are drawn up and the way the terms of service set out what is permissible in them.
Nothing in the Bill restricts service providers’ ability to set their own terms and conditions for legal content accessed by adults—that is worth stressing. Ofcom will not set platforms’ terms and conditions, nor will it take decisions on whether individual pieces of content should, or should not, be on a platform. Rather, it will ensure that platforms set clear terms and conditions, so that adults know what to expect online, and ensure that platforms have systems and processes in place to enforce those terms and conditions themselves.
Amendment 226 from the noble Baroness, Lady Fox, would require providers to use all relevant information that is reasonably available to them whenever they make judgments about content under their terms of service—that is, where they have drafted those terms of service in compliance with duties in the Bill. Her amendment would alter an existing requirement in Clause 173, which already requires providers to take this approach whenever they implement a system or process to comply with the Bill and that system makes judgments about certain content. For example, Clause 173 already covers content judgments made via systems and processes that a category 1 provider implements to fulfil its Clause 65 duties to enforce its own terms of service consistently. So we feel that Clause 173 is already broad enough to achieve the objectives that the noble Baroness, Lady Fox, seeks.
My noble friend Lord Moylan’s amendments seek to require Ofcom to have special regard to the importance of protecting freedom of expression when exercising its enforcement duties and when drafting codes or guidance. As we discussed in Committee, Ofcom has existing obligations to protect freedom of expression, and the Bill will include additional measures in this regard. We are also making additional amendments to underline the importance of freedom of expression. I am grateful to the noble and learned Lord, Lord Hope of Craighead, and my noble friend Lady Fraser of Craigmaddie for their work to define “freedom of expression” in the Bill. The Bill’s new overarching statement at Clause 1, as the noble Lord, Lord Stevenson, rightly pointed out, lists “freedom of expression”, signalling that it is a fundamental part of the Bill. That is a helpful addition.
Amendment 188 in the name of the noble Baroness, Lady Fox, seeks to disapply platforms’ Clause 65 duties when platforms’ terms of service restrict lawful expression, or expression otherwise protected by Article 10 of the European Convention on Human Rights. Her amendment would mean that category 1 providers’ Clause 65 duties to enforce clear, accessible terms of service in a consistent manner would not apply to any of their terms of service, where they are making their own decisions restricting legal content. That would greatly undermine the application of these provisions in the Bill.
Article 10 of the European Convention on Human Rights concerns individuals’ and entities’ rights to receive and impart ideas without undue interference by public authorities, not private entities. As such, it is not clear how a service provider deciding not to allow a certain type of content on its platform would engage the Article 10 rights of a user.
Beyond the legal obligations regarding the treatment of certain kinds of user-generated content imposed by this Bill and by other legislation, platforms are free to decide what content they wish, or do not wish, to have on their services. Provisions in the Bill will set out important duties to ensure that providers’ contractual terms on such matters are clear, accessible and consistently enforced.
My Lords, before my noble friend sits down, perhaps I could seek a point of clarification. I think I heard him say, at the beginning of his response to this short debate, that providers will be required to have terms of service which respect users’ rights. May I ask him a very straightforward question: do those rights include the rights conferred by Article 10 of the European Convention on Human Rights? Put another way, is it possible for a provider operating in the United Kingdom to have terms and conditions that abridge the rights conferred by Article 10? If it is possible, what is the Government’s defence of that? If it is not possible, what is the mechanism by which the Bill achieves that?
As I set out, I think my noble friend and the noble Baroness, Lady Fox, are not right to point to the European Convention on Human Rights here. That concerns individuals’ and entities’ rights
“to receive and impart ideas without undue interference”
by public authorities, not private entities. We do not see how a service provider deciding not to allow certain types of content on its platform would engage the Article 10 rights of the user, but I would be very happy to discuss this further with my noble friend and the noble Baroness in case we are talking at cross-purposes.
On that point specifically, having worked inside one of the companies, they fear legal action under all sorts of laws, but not under the European Convention on Human Rights. As the Minister explained, it is for public bodies; if people are going to take a case on Article 10 grounds, they will be taking it against a public body. There are lots of other grounds to go after a private company but not ECHR compliance.