(1 year, 6 months ago)
Lords Chamber
My Lords, I will make a short contribution on this substantive question of whether concerns about ministerial overreach are legitimate. Based on a decade of being on the receiving end of representations from Ministers, the short answer is yes. I want to expand on that with some examples.
My experience of working on the other side, inside a company, was that you often got what I call the cycle of outrage: something is shared on social media that upsets people; the media write a front-page story about it; government Ministers and other politicians get involved; that then feeds back into the media and the cycle spins up to a point where something must be done. The “something” is typically that the Minister summons people, such as me in my old job, and brings them into an office. That itself often becomes a major TV moment, where you are brought in, browbeaten and sent out again with your tail between your legs, and the Minister has instructed you to do something. That entire process takes place in the political rather than the regulatory domain.
I readily concede that, in many cases, something of substance needed to be addressed and there was a genuine problem. It is not that this was illegitimate, but these amendments are talking about the process for what we should do when that outrage is happening. I agree entirely with the tablers of the amendments that, to the extent that that process can be encapsulated within the regulator rather than a Minister acting on an ad hoc basis, it would be a significant improvement.
I also note that this is certainly not UK-specific, and it would happen in many countries with varying degrees of threat. I remember being summoned to the Ministry of the Interior in Italy to meet a gentleman who has now sadly passed away. He brought me into his office, sat me down, pointed to his desk and said, “You see that desk? That was Mussolini’s desk”. He was a nice guy and I left with a CD of his rhythm and blues band, but it was clear that I was not supposed to say no to him. He gave a very clear and explicit political direction about content that was on the platform.
One big advantage of this Bill is that it has the potential to move beyond that world. It could move from individual people in companies—the noble Baroness, Lady Stowell of Beeston, made this point very powerfully—to changing the accountability model away from either platforms being entirely accountable themselves or platforms and others, including Ministers, somehow doing deals that will have an impact, as the noble Baroness, Lady Fox, and the noble Viscount, Lord Colville, said, on the freedom of expression of people across the country. We do not want that.
We want to move on in the Bill and I think we have a model which could work. The regulator will take on the outrage and go as far as it can under the powers granted in the Bill. If the regulator believes that it has insufficient powers, it will come back to Parliament and ask for more. That is the way in which the system can and should work. I think I referred to this at Second Reading; we have an opportunity to create clear accountability. Parliament instructs Ofcom, which instructs the platforms. The platforms do what Ofcom says, or Ofcom can sanction them. If Ofcom feels that its powers are deficient, it comes back to Parliament. The noble Lord, Lord Stevenson, and others made the point about scrutiny and us continually testing whether Ofcom has the powers and is exercising them correctly. Again, that is entirely beneficial and the Government should certainly be minded to accept those amendments.
With the Secretary of State powers, as drafted in the Bill and without the amendments we are considering today, we are effectively taking two steps forward and one step back on transparency and accountability. We have to ask: why take that step back when we are able to rely on Ofcom to do the job without these directions?
The noble Baroness, Lady Stowell of Beeston, made the point very clearly that there are other ways of doing this. The Secretary of State can express their view. I am sure that the Minister will be arguing that the Secretary of State’s powers in the Bill are better than the status quo because at least what the Secretary of State says will be visible; it will not be a back-room deal. The noble Baroness, Lady Stowell of Beeston, has proposed a very good alternative, where the Secretary of State makes visible their intentions, but not in the form of an order—rather in the form of advice. The public—it is their speech we are talking about—then have the ability to see whether they agree with Ofcom, the companies or the Secretary of State if there is any dispute about what should happen.
It is certainly the case that visible instructions from the Secretary of State would be better, but the powers as they are still leave room for arm-twisting. I can imagine a future scenario in which future employees of these platforms are summoned to the Secretary of State. But now the Secretary of State would have a draft order sitting there. The draft order is Mussolini’s desk. They say to the people from the platforms, “Look, you can do what I say, or I am going to send an order to Ofcom”. That takes us back to this world in which the public are not seeing the kind of instructions being given.
I hope that the Government will accept that some amendment is needed here. All the ones that have been proposed suggest different ways of achieving the same objective. We are trying to protect future Secretaries of State from an unhealthy temptation to intervene in ways that they should not.
My Lords, on day eight of Committee, I feel that we have all found our role. Each of us has spoken in a similar vein on a number of amendments, so I will try to be brief. As the noble Lord, Lord Allan, has spoken from his experience, I will once again reference my experience as the chief executive, for seven years, of a business regulated by Ofcom; as the chair of a regulator; and as someone who sat on the court of, arguably, the most independent of independent regulators, the Bank of England, for eight years.
I speak in support of the amendments in the name of my noble friend Lady Stowell, because, as a member of the Communications and Digital Committee, my experience, both of being regulated and as a regulator, is that independent regulators might be independent in name—they might even be independent in statute—but they exist in the political soup. It is tempting to think that they are a sort of granite island, completely immovable in the political soup, but they are more like a boat bobbing along in the turbulence of politics.
As the noble Lord, Lord Allan, has just described, they are influenced both overtly and subtly by the regulated companies themselves—I am sure we have both played that game—by politicians on all sides, and by the Government. We have played these roles a number of times in the last eight days; however, this is one of the most important groups of amendments, if we are to send the Bill back in a shape that will really make the difference that we want it to. This group of amendments challenges whether we have the right assignment of responsibility between Parliament, the regulator, government, the regulated and citizens.
It is interesting that we—every speaker so far—are all united that the Bill, as it currently stands, does not get that right. To explain why I think that, I will dwell on Amendment 114 in the name of my noble friend Lady Stowell. The amendment would remove the Secretary of State’s ability to direct Ofcom to modify a draft of the code of practice “for reasons of public policy”. It leaves open the ability to direct in the cases of terrorism, child sexual abuse, national security or public safety, but it stops the Secretary of State directing with regard to public policy. The reason I think that is so important is that, while tech companies are not wicked and evil, they have singularly failed to put internet safety, particularly child internet safety, high enough up their pecking order compared with delivering for their customers and shareholders. I do not see how a Secretary of State will be any better at that.
Arguably, the pressures on a Secretary of State are much greater than the pressures on the chief executives of tech companies. Secretaries of State will feel those pressures from the tech companies and their constituents lobbying them, and they will want to intervene and feel that they should. They will then push that bobbing boat of the independent regulator towards whichever shore they feel they need to in the moment—but that is not the way you protect people. That is not the way that we treat health and safety in the physical world. We do not say, “Well, maybe economics is more important than building a building that’s not going to fall down if we have a hurricane”. We say that we need to build safe buildings. Some 200 years ago, we were having the same debates about the physical world in this place; we were debating whether you needed to protect children working in factories, and the consequences for the economics. Well, how awful it is to say that today. That is the reality of what we are saying in the Bill now: that we are giving the Secretary of State the power to claim that the economic priority is greater than protecting children online.
I am starting to sound very emotional because at the heart of this is the suggestion that we are not taking the harms seriously enough. If we really think that we should be giving the Secretary of State the freedom to direct the regulator in such a broad way, we are diminishing the seriousness of the Bill. That is why I wholeheartedly welcome the remark from the noble Lord, Lord Stevenson, that he intends to bring this back with the full force of all of us across all sides of the Committee, if we do not hear some encouraging words from my noble friend the Minister.
My Lords, it is a pleasure to follow the noble Baroness, Lady Harding, whose very powerful speech took us to the heart of the principles behind these amendments. I will add my voice, very briefly, to support the amendments for all the key reasons given. The regulator needs to be independent of the Secretary of State and seen to be so. That is the understandable view of the regulator itself, Ofcom; it was the view of the scrutiny committee; and it appears to be the view of all sides and all speakers in this debate. I am also very supportive of the various points made in favour of the principle of proper parliamentary scrutiny of the regulator going forward.
One of the key hopes for the Bill, which I think we all share, is that it will help set the tone for the future global conversation about the regulation of social media and other channels. The Government’s own impact assessment on the Bill details parallel laws under consideration in the EU, France, Australia, Germany and Ireland, and the noble Viscount, Lord Colville, referred to standards set by UNESCO. The standards set in the OSB at this point will therefore be a benchmark across the world. I urge the Government to set that benchmark at the highest possible level for the independence and parliamentary oversight of the regulator.
My Lords, the amendments concern the independence of Ofcom and the role of parliamentary scrutiny. They are therefore indeed an important group, as those things will be vital to the success of the regime that the Bill sets up. Introducing a new, ground-breaking regime means balancing the need for regulatory independence with a transparent system of checks and balances. The Bill therefore gives powers to the Secretary of State comprising a power to direct Ofcom to modify a code of practice, a power to issue a statement of strategic priorities and a power to issue non-binding guidance to the regulator.
These powers are important but not novel; they have precedent in the Communications Act 2003, which allows the Secretary of State to direct Ofcom in respect of its network and spectrum functions, and the Housing and Regeneration Act 2008, which allows the Secretary of State to make directions to the Regulator of Social Housing to amend its standards. At the same time, I agree that it is important that we have proportionate safeguards in place for the use of these powers, and I am very happy to continue to have discussions with noble Lords to make sure that we do.
Amendment 110, from the noble Lord, Lord Stevenson, seeks to introduce a lengthier process regarding parliamentary approval of codes of practice, requiring a number of additional steps before they are laid in Parliament. It proposes that each code may not come into force unless accompanied by an impact assessment covering a range of factors. Let me reassure noble Lords that Ofcom is already required to consider these factors; it is bound by the public sector equality duty under the Equality Act 2010 and the Human Rights Act 1998 and must ensure that the regime and the codes of practice are compliant with rights under the European Convention on Human Rights. It must also consult experts on matters of equality and human rights when producing its codes.
Amendment 110 also proposes that any designated Select Committee in either House has to report on each code and impact assessment before they can be made. Under the existing process, all codes must already undergo scrutiny by both Houses before coming into effect. The amendment would also introduce a new role for the devolved Administrations. Let me reassure noble Lords that the Government are working closely with them already and will continue to do so over the coming months. As set out in Schedule 5 to the Scotland Act 1998, however, telecommunications, and thereby internet law and regulation, is a reserved policy area, so input from the devolved Administrations may be more appropriately sought through other means.
Amendments 111, 113, 114, 115, and 117 to 120 seek to restrict or remove the ability of the Secretary of State to issue directions to Ofcom to modify draft codes of practice. Ofcom has great expertise as a regulator, as noble Lords noted in this debate, but there may be situations where a topic outside its remit needs to be reflected in a code of practice. In those situations, it is right for the Government to be able to direct Ofcom to modify a draft code. This could, for example, be to ensure that a code reflects advice from the security services, to which Ofcom does not have access. Indeed, it is particularly important that the Secretary of State be able to direct Ofcom on matters of national security and public safety, where the Government will have access to information which Ofcom will not.
I have, however, heard the concerns raised by many in your Lordships’ House, both today and on previous occasions, that these powers could allow for too much executive control. I can assure your Lordships that His Majesty’s Government are committed to protecting the regulatory independence of Ofcom, which is vital to the success of the framework. With this in mind, we have built a number of safeguards into the use of the powers, to ensure that they do not impinge on regulatory independence and are used only in limited circumstances and for the appropriate reasons.
I have heard the strong feelings expressed that this power must not unduly restrict regulatory independence, and indeed share that feeling. In July, as noble Lords noted, the Government announced our intention to make substantive changes to the power; these changes will make it clear that the power is for use only in exceptional circumstances and will replace the “public policy” wording in Clause 39 with a defined list of reasons for which a direction can be made. I am happy to reiterate that commitment today, and to say that we will be making these changes on Report when, as the noble Lord, Lord Clement-Jones, rightly said, noble Lords will be able to see the wording and interrogate it properly.
Additionally, in light of the debate we have just had today—
Can my noble friend the Minister clarify what he has just said? When he appeared in front of the Communications and Digital Committee, I think he might have been road-testing some of that language. In the specific words used, he would still have allowed the Secretary of State to direct Ofcom for economic reasons. Is that likely to remain the case? If it is, I feel it will not actually meet what I have heard is the will of the Committee.
My Lords, I too should have spoken before the noble Lord, Lord Allan; I should have known, given his position on the Front Bench, that he was speaking on behalf of the Liberal Democrats. I was a little reticent to follow him, knowing his expertise in the technical area, but I am very pleased to do so now. I support this very important group of amendments and thank noble Lords for placing them before us. I echo the thanks to all the children’s NGOs that have been working in this area for so long.
For legislators, ambiguity is rarely a friend, and this is particularly true in legislation dealing with digital communications, where, as we all acknowledge, the law struggles to keep pace with technical innovation. Where there is ambiguity, sites will be creative and will evade what they see as barriers—of that I have no doubt. Therefore, I strongly believe that there is a need to have clarity where it can be achieved. That is why it is important to have in the Bill a clear definition of age verification for pornography.
As we have heard this evening, we know that pornography is having a devastating impact on our young people and children: it is impacting their mental health and distorting their views of healthy sexual relationships. It is very upsetting for me that evidence shows that children are replicating the acts they see in pornographic content, thinking that it is normal. It is very upsetting that, in particular, young boys who watch porn think that violence during intimacy is a normal thing to do. The NSPCC has told us that four in 10 boys aged 11 to 16 who regularly view porn say they want to do that because they want to get ideas as to the type of sex they want to try. That is chilling. Even more chilling is the fact that content is often marketed towards children, featuring characters from cartoons, such as “Frozen”, “Scooby Doo” and “The Incredibles”, to try to draw young people on to those sites. Frankly, that is unforgivable; it is why we need robust age verification to protect our children from this content. It must apply to all content, regardless of where it is found; we know, for instance, that Twitter is often a gateway to pornographic sites for young people.
The noble Lord, Lord Bethell, referred to ensuring, beyond all reasonable doubt, that the user is over 18. I know that that is a very high standard—it is the criminal law level—but I believe it is what is needed. I am interested to hear what the Minister has to say about that, because, if we are to protect children and if we take on the role of the fireguard, which the right reverend Prelate referred to, we need to make sure that it is as strong as possible.
Also, this is not just about making sure that users are over 18; we need to make sure that adults, not children, are involved in the content. The noble Baroness, Lady Benjamin, talked about adults being made to look like children, but there is also the whole area of young people being trafficked and abused into pornography production; therefore, Amendment 184 on performer age checks is very important.
I finish by indicating my strong support for Amendment 185 in the name of the noble Baroness, Lady Benjamin. Some, if not most, mainstream pornography sites carry content that is degrading, extremely abusive and violent. Such content would be prohibited in the offline world and is illegal to possess; this includes sexual violence such as strangulation, incest and the sexualisation of children. We know that this is happening online because, as we have heard, some of the most frequently searched terms on porn sites are “teens”, “schoolgirls” or “girls”, and the lack of regulation online has allowed content to become more and more extreme and abusive. That is why I support Amendment 185 in the name of the noble Baroness, Lady Benjamin, which seeks to bring parity between the online and offline regulation of pornographic content.
This Bill has been eagerly awaited. There is no doubt about that. It has been long in the gestation—some people would say too long. We have had much discussion in this Committee, but let us get it right. I urge the Minister to take on board the many points made this afternoon. That fireguard needs not only to be put in place but to be fixed there, so that it does not move, is not knocked aside and is at its most effective. I support the amendments.
My Lords, I also failed to stand up before the noble Lord, Lord Allan, did. I too am always slightly nervous to speak before or after him for fear of not having the detailed knowledge that he does. There have been so many powerful speeches in this group. I will try to speak swiftly.
My role in this amendment was predefined for me by the noble Baroness, Lady Kidron, as the midwife. I have spent many hours debating these amendments with my noble friend Lord Bethell, the noble Baroness, Lady Kidron, and with many noble Lords who have already spoken in this debate. I think it is very clear from the debate why it is so important to put a definition of age assurance and age verification on the face of the Bill. People feel so passionately about this subject. We are creating the digital legal scaffolding, so being really clear what we mean by the words matters. It really matters and we have seen it mattering even in the course of this debate.
My two friends—they are my friends—the noble Baroness, Lady Kidron, and my noble friend Lord Bethell both used the word “proportionate”, with one not wanting us to be proportionate and the other wanting us to be proportionate. Yet, both have their names to the same amendment. I thought it might be helpful to explain what I think they both mean—I am sure they will interrupt me if I get this wrong—and explain why the words of the amendment matter so much.
Age assurance should not be proportionate for pornography. It should be the highest possible bar. We should do everything in our power to stop children seeing it, whether it is on a specific porn site or on any other site. We do not want our children to see pornography; we are all agreed on that. There should not be anything proportionate about that. It should be the highest bar. Whether “beyond reasonable doubt” is the right wording or it should instead be “the highest possible bar practically achievable”, I do not know. I would be very keen to hear my noble friend the Minister’s thoughts on what the right wording is because, surely, we are all clear that it should be disproportionate; it should absolutely be the hardest line we can take.
Equally, age assurance is not just about pornography, as the noble Lord, Lord Allan, has said. We need to have a proportionate approach. We need a ladder where age assurance for pornography sits at the top, and where we are making sure that nine year-olds cannot access social media sites if they are age-rated for 13. We all know that we can go into any primary school classroom in the land and find that the majority of nine year-olds are on social media. We do not have good age assurance further down.
As both the noble Lord, Lord Allan, and the noble Baroness, Lady Kidron, have said, we need age assurance to enable providers to adapt the experience to make it age-appropriate for children on services we want children to use. It needs to be both proportionate and disproportionate, and that needs to be defined on the face of the Bill. If we do not, I fear that we will fall into the trap that the noble Lord, Lord Allan, mentioned: the cookie trap. We will have very well-intentioned work that will not protect children and will go against the very thing that we are all looking for.
In my role as the pragmatic midwife, I implore my noble friend the Minister to hear what we are all saying and to help us between Committee and Report, so that we can come back together with a clear definition of age assurance and age verification on the face of the Bill that we can all support.
My Lords, about half an hour ago I decided I would not speak, but as we have now got to this point, I thought I might as well say what I was going to say after all. I reassure noble Lords that in Committee it is perfectly permissible to speak after the winder, so no one is breaking any procedural convention. That said, I will be very brief.
My first purpose in rising is to honour a commitment I made last week when I spoke against the violence against women and girls code. I said that I would none the less be more sympathetic to and supportive of stronger restrictions preventing child access to pornography, so I want to get my support on the record and honour that commitment in this context.
My noble friend Lady Harding spoke on the last group about bringing our previous experiences to bear when contributing to some of these issues. As I may have said in the context of other amendments earlier in Committee, as a former regulator, I know that one of the important guiding principles is to ensure that you regulate for a reason. It is very easy for regulators to have a set of rules. The noble Baroness, Lady Kidron, referred to rules of the road for the tech companies to follow. It is very easy for regulators to examine whether those rules are being followed and, having decided that they have, to say that they have discharged their responsibility. That is not good enough. There must be a result, an outcome from that. As the noble Lord, Lord Allan, emphasised, this must be about outcomes and intended benefits.
I support making it clear in the Bill that, as my noble friend Lady Harding said, we are trying to prevent, disproportionately, children accessing pornography. We will do all we can to ensure that it happens, and that should be because of the rules being in place. Ofcom should be clear on that. However, I also support a proportionate approach to age assurance in all other contexts, as has been described. Therefore, I support the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell, and the role my noble friend Lady Harding has played in arriving at a pragmatic solution.
My Lords, I thank everyone for their contributions this evening. As the noble Lord, Lord Stevenson, said, it is very compelling when your Lordships’ House gets itself together on a particular subject and really agrees, so I thank noble Lords very much for that.
I am going to do two things. One is to pick up on a couple of questions and, as has been said by a number of noble Lords, concentrate on outcomes rather than contributions. On a couple of issues that came up, I feel that the principle of pornography being treated in the same way in Parts 3 and 5 is absolute. We believe we have done it. After Committee we will discuss that with noble Lords who feel that is not clear in the amendment to make sure they are comfortable that it is so. I did not quite gather from the Minister’s reply whether pornography was being treated in exactly the same way in Parts 3 and 5. When I say “exactly the same way”, like the noble Lord, Lord Allan, I mean not necessarily by the same technology but to the same level of outcome. That is one thing I want to emphasise because a number of noble Lords, including the noble Baroness, Lady Ritchie, the noble Lord, Lord Farmer, and others, are rightly concerned that we should have an outcome on pornography, not concentrate on how to get there.
The second thing I want to pick up very briefly, because it was received so warmly, is the question of devices and on-device age assurance. I believe that is one method, and I know that at least one manufacturer is thinking about it as we speak. However, it is an old battle in which companies that do not want to take responsibility for their services say that people over here should do something different. It is very important that devices, app stores or any of the supposed gatekeepers are not given an overly large responsibility. It is the responsibility of everyone to make sure that age assurance is adequate.
I hope that what the noble Baroness is alluding to is that we need to include gatekeepers, app stores, device level and sideloading in another part of the Bill.
But of course—would I dare otherwise? What I am saying is that these are not silver bullets. We must have a mixed economy, not only for what we know already but for what we do not yet know, and we must not make any one platform of age assurance overly powerful. That is incredibly important, so I wanted to pick up on that.
I also want to pick up on user behaviour and unintended consequences. I think there was a slight reference to an American law, which is called COPPA and is the reason that every website says 13. That is a very unhelpful entry point. It would be much better if children had an age-appropriate experience from five all the way to 18, rather than on and off at 13. I understand that issue, but that is why age assurance has to be more than one thing. It is not only a preventive thing but an enabling thing. I tried to make that very clear so I will not detain the Committee on that.
On the outcome, I say to the Minister, who has indeed given a great deal of time to this, that more time is needed because we want a bar of assurance. I speak not only for all noble Lords who have made clear their rightful anxiety about pornography but also on behalf of the bereaved parents and other noble Lords who raised issues about self-harming of different varieties. We must have a measurable bar for the things that the Bill says that children will not encounter—the primary priority harms. In the negotiation, that is non-negotiable.
On the time factor, I am sorry to say that we are all witness to what happened to Part 3. It was pushed and pushed for years, and then it did not happen—and then it was whipped out of the Bill last week. This is not acceptable. I am happy, as I believe other noble Lords are, to negotiate a suitable time that gives Ofcom comfort, but it must be possible, with this Bill, for a regulator to bring something in within a given period of time. I am afraid that history is our enemy on this one.
The third thing is that I accept the idea that there has to be more than principles, which is what I believe Ofcom will provide. But the principles have to be 360 degrees, and the questions that I raised about security, privacy and accessibility should be in the Bill so that Ofcom can go away and make some difficult judgments. That is its job; ours is to say what the principle is.
I will tell one last tiny story. About 10 years ago, I met in secret with one of the highest-ranking safety officers in one of the companies that we always talk about. They said to me, “We call it the ‘lost generation’. We know that regulation is coming, but we know that it is not soon enough for this generation”. On behalf of all noble Lords who spoke, I ask the Government to save the next generation. With that, I withdraw the amendment.
(1 year, 6 months ago)
Lords Chamber
It has let me know as well. In a way, the amendment seeks to formalise what is already an informal mechanism. I was minded initially to support Amendment 56 in the name of my noble friend Lord Clement-Jones and the noble Lord, Lord Stevenson.
This landscape is quite varied. We have to create some kind of outlet, as the noble Baroness, Lady Kidron, rightly said. That parent or individual will want to go somewhere, so we have to send them somewhere. We want that somewhere to be effective, not to get bogged down in spurious and vexatious complaints. We want it to have a high signal-to-noise ratio—to pull out the important complaints and get them to the platforms. That will vary from platform to platform. In some ways, we want to empower Ofcom to look at what is and is not working and to be able to say, “Platform A has built up an incredible set of mechanisms. It’s doing a good job. We’re not seeing things falling through the cracks in the same way as we are seeing with platform B. We are going to have to be more directive with platform B”. That very much depends on the information coming in and on how well the platforms are doing their job already.
I hope that the Government are thinking about how these individual complaints will be dealt with and about the demand that will be created by the Bill. How can we have effective mechanisms for people in the United Kingdom who genuinely have hard cases and have tried, but where there is no intermediary for the platform they are worried about? In many cases, I suspect that these will be newer or smaller platforms that have arrived on the scene and do not have established relationships. Where are these people to go? Who will help them, particularly in cases where the platform may not systemically be doing anything wrong? Its policies are correct and it is enforcing them correctly, but any jury of peers would say that an injustice is being done. Either an exception needs to be made or there needs to be a second look at that specific case. We are not asking Ofcom to do this in the rest of the legislation.
My Lords, it is always somewhat intimidating to follow the noble Lord, Lord Allan, though it is wonderful to have him back from his travels. I too will speak in favour of Amendments 250A and 250B in the name of my noble friend, from not direct experience in the social media world but tangentially, from telecoms regulation.
I have lived, as the chief executive of a business, in a world where my customers could complain to me but also to an ombudsman and to Ofcom. I say this with some hesitation, as my dear old friends at TalkTalk will be horrified to hear me quoting this example, but 13 years ago, when I took over as chief executive, TalkTalk accounted for more complaints to Ofcom than pretty much all the other telcos put together. We were not trying to be bad—quite the opposite, actually. We were a business born out of very rapid growth, both organic and acquisitive, and we did not have control of our business at the time. We had an internal complaints process and were trying our hardest to listen to it and to individual customers who were telling us that we were letting them down, but we were not doing that very well.
While my noble friend has spoken so eloquently about the importance of complaints mechanisms for individual citizens, I am actually in favour of them for companies. I felt the consequences of having an independent complaints system that made my business listen. It was a genuine failsafe system. For someone to have got as far as complaining to the telecoms ombudsman and to Ofcom, they had really lost the will to live with my own business. That forced my company to change. It has forced telecoms companies to change so much that they now advertise where they stand in the rankings of complaints per thousand customers. Even in the course of the last week, Sky was proclaiming in its print advertising that it was the provider least complained about to the independent complaints mechanism.
So this is not about thinking that companies are bad and are trying to let their customers down. As the noble Lord, Lord Allan, has described, managing these processes is really hard and you really need the third line of defence of an independent complaints mechanism to help you deliver on your best intentions. I think most companies with very large customer bases are trying to meet those customers’ needs.
For very practical reasons, I have experienced the power of these sorts of systems. There is one difference with the example I have given of telecoms: it was Ofcom itself that received most of those complaints about TalkTalk 13 years ago, and I have tremendous sympathy with the idea that we might unleash on poor Ofcom all the social media complaints that are not currently being resolved by the companies. That is exactly why, as Dame Maria Miller said, we need to set up an independent ombudsman to deal with this issue.
From a very different perspective from that of my noble friend, I struggle to understand why the Government do not want to do what they have just announced they want to do in other sectors such as gambling.
(1 year, 6 months ago)
Lords Chamber
I will not detain noble Lords very long either. Two things have motivated me to be involved in this Bill. One is protection for vulnerable adults and the second is looking at this legislation with my Scottish head on, because nobody else seems to be looking at it from the perspective of the devolved Administrations.
First, on protection for vulnerable adults, we have already debated the fact that in an earlier iteration of this Bill, there were protections. These have been watered down and we now have the triple shield. Whether they fit here, with the amendment from my noble friend Lady Stowell, or fit earlier, what we are all asking for is the reinstatement of risk assessments. I come at this from a protection of vulnerable groups perspective, but I recognise that others come at it from a freedom of expression perspective. I do not think the Minister has answered my earlier questions. Why have risk assessments been taken out and why are they any threat? It seems to be the will of the debate today that they do nothing but strengthen the transparency and safety aspects of the Bill, wherever they might be put.
I speak with trepidation to Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead. I flatter myself that his amendment and mine are trying to do a similar thing. I will speak to my amendment when we come to the group on devolved issues, but I think what both of us are trying to establish is, given that the Bill is relatively quiet on how freedom of expression is defined, how do platforms balance competing rights, particularly in the light of the differences between the devolved Administrations?
The Minister will know that the Hate Crime and Public Order (Scotland) Act 2021 made my brain hurt when trying to work out how this Bill affects it, or how it affects the Bill. What is definitely clear is that there are differences between the devolved Administrations in how freedom of expression is interpreted. I will study the noble and learned Lord’s remarks very carefully in Hansard; I need a little time to think about them. I will listen very carefully to the Minister’s response and I look forward to the later group.
My Lords, I too will be very brief. As a member of the Communications and Digital Committee, I just wanted to speak in support of my noble friend Lady Stowell of Beeston and her extremely powerful speech, which seems like it was quite a long time ago now, but it was not that long. I want to highlight two things. I do not understand how, as a number of noble Lords have said, having risk assessments is a threat to freedom of expression. I think the absolute opposite is the case. They would enhance all the things the noble Baroness, Lady Fox, is looking to see in the Bill, just as much as they would enhance the protections that my noble friend, who I always seem to follow in this debate, is looking for.
Like my noble friend, I ask the Minister: why not? When the Government announced the removal of legal but harmful and the creation of user empowerment tools, I remember thinking—in the midst of being quite busy with Covid—“What are user empowerment tools and what are they going to empower me to do?” Without a risk assessment, I do not know how we answer that question. The risk is that we are throwing that question straight to the tech companies to decide for themselves. A risk assessment provides the framework that would enable user empowerment tools to do what I think the Government intend.
Finally, I too will speak against my noble friend Lord Moylan’s Amendment 294 on psychological harm. It is well documented that tech platforms are designed to drive addiction. Addiction can be physiological and psychological. We ignore that at our peril.
My Lords, it is a pleasure to have been part of this debate and to have heard how much we are on common ground. I very much hope that, in particular, the Minister will have listened to the voices on the Conservative Benches that have very powerfully put forward a number of amendments that I think have gained general acceptance across the Committee.
I fully understand the points that the noble Lord, Lord Black, made and why he defends Clause 14. I hope we can have a more granular discussion about the contents of that clause rather than wrap it up on this group of amendments. I do not know whether we will be able to have that on the next group.
I thank the noble Baroness, Lady Stowell, for putting forward her amendment. It is very interesting, as the noble Baronesses, Lady Bull and Lady Fraser, said, that we are trying to get to the same sort of mechanisms of risk assessment, perhaps out of different motives, but we are broadly along the same lines and want to see them for adult services. We want to know from the Minister why we cannot achieve that, basically. I am sure we could come to some agreement between us as to whether user empowerment tools or terms of service are the most appropriate way of doing it.
We need to thank the committee that the noble Baroness chairs for having followed up on the letter to the Secretary of State for DCMS, as was, on 30 January. It is good to see a Select Committee using its influence to go forward in this way.
The amendments tabled by the noble Lord, Lord Kamall, and supported by my noble friend Lady Featherstone—I am sorry she is unable to be here today, as he said—are important. They would broaden out consideration in exactly the right kind of way.
However, dare I say it, probably the most important amendment in this group is Amendment 48 in the name of the noble Lord, Lord Stevenson. Apart from the Clause 14 stand part notice, it is pretty much bang on where the Joint Committee got to. He was remarkably tactful in not going into any detail on the Government’s response to that committee. I will not read it out because of the lateness of the hour, but the noble Viscount, Lord Colville, got pretty close to puncturing the Government’s case that there is no proper definition of public interest. It is quite clear that there is a perfectly respectable definition in the Human Rights Act 1998 and, as the noble Viscount said, in the Defamation Act 2013, which would be quite fit for purpose. I do not quite know why the Government responded as they did at paragraph 251. I very much hope that the Minister will have another look at that.
The amendment from the noble and learned Lord, Lord Hope, which has the very respectable support of Justice, is also entirely apposite. I very much hope that the Government will take a good look at that.
Finally, and extraordinarily, I have quite a lot of sympathy with the amendments from the noble Lord, Lord Moylan. It was all going so well until we got to Amendment 294; up to that point I think he had support from across the House, because placing that kind of duty on Ofcom would be a positive way forward.
As I say, getting a clause of the kind that the noble Lord, Lord Stevenson, has put forward, with that public interest content point and with an umbrella duty on freedom of expression, allied to the definition from the noble and learned Lord, Lord Hope, would really get us somewhere.
(1 year, 6 months ago)
Lords Chamber
My Lords, this is my first contribution to the Bill, and I feel I need to apologise in advance for my lack of knowledge and expertise in this whole field. In her initial remarks, the noble Baroness, Lady Morgan of Cotes, was saying “Don’t worry, because you don’t need to be a lawyer”. Unfortunately, I do not have any expertise in the field of the internet and social media and all of that as well, so I will be very brief in all of my remarks on the Bill. But I feel that I cannot allow the Bill to go past without at least making a few remarks, as equalities spokesperson for the Lib Dems. The issues are of passionate importance to me, and of course to victims of online abuse, and it is those victims for whom I speak today.
In this group, I will address my remarks to Amendments 34 and 35, in which we have discussed content deemed to be harmful—suicide, self-harm, eating disorders and abuse and hate content—under the triple shield approach, although this content discussion has strayed somewhat during the course of the debate.
Much harmful material, as we have heard, initially comes to the user uninvited. I do not pretend to understand how these algorithms work, but my understanding is that if you open one such piece of content, they literally click into action, feeding more and more of this kind of content to you in your feed. The suicide of young Molly Russell is a tragic example of the devastating consequences of the damage these algorithms can contribute to. I am glad that the Bill will go further to protect children, but it still leaves adults—some young and vulnerable—without such protection and with the same amount of automatic exposure to harmful content, which algorithms can increase with engagement, and which could have overwhelming impacts on their mental health, as my noble friend Lady Parminter so movingly and eloquently described.
So this amendment means a user would have to make an active, conscious choice to be exposed to such content: an opt out rather than an opt in. This has been discussed at length by noble Lords a great deal more versed in the subject than me. But surely the only persons or organisations who would not support this would be the ones who do not have the best interests of the vulnerable users we have been talking about this afternoon at heart. I hope the Minister will confirm in his remarks that the Government do.
My Lords, I had not intended to speak in this debate because I now need to declare an unusual interest, in that Amendment 38A has been widely supported outside this Chamber by my husband, the Member of Parliament for Weston-super-Mare. I am not intending to speak on that amendment but, none the less, I mention it just in case.
I rise to speak because I have been so moved by the speeches, not least the right reverend Prelate’s speech. I would like just to briefly address the “default on” amendments and add my support. Like others, on balance I favour the amendments in the name of the noble Lord, Lord Clement-Jones, but would willingly throw my support behind my noble friend Lady Morgan were that the preferred choice in the Chamber.
I would like to simply add two additional reasons why I ask my noble friend the Minister to really reflect hard on this debate. The first is that children become teenagers, who become young adults, and it is a gradual transition—goodness, do I feel it as the mother of a 16 year-old and a 17 year-old. The idea that on one day all the protections just disappear completely and we require our 18 year-olds to immediately reconfigure their use of all digital tools just does not seem a sensible transition to adulthood to me, whereas the ability to switch off user empowerment tools as you mature as an adult seems a very sensible transition.
Secondly, I respect very much the free speech arguments that the noble Baroness, Lady Fox, made but I do not think this is a debate about the importance of free speech. It is actually about how effective the user empowerment tools are. If they are so hard for non-vulnerable adults to turn off, what hope have vulnerable adults to be able to turn them on? For the triple shield to work and the three-legged stool to be effective, the onus needs to be on the tech companies to make these user empowerment tools really easy to turn on and turn off. Then “default on” is not a restriction on freedom of speech at all; it is simply a means of protecting our most vulnerable.
My Lords, this has been a very thoughtful and thought-provoking debate. I start very much from the point of view expressed by the noble Baroness, Lady Kidron, and this brings the noble Baroness, Lady Buscombe, into agreement—it is not about the content; this is about features. The noble Baroness, Lady Harding, made exactly the same point, as did the noble Baroness, Lady Healy—this is not about restriction on freedom of speech but about a design feature in the Bill which is of crucial importance.
When I was putting together the two amendments that I have tabled, I was very much taken by what Parent Zone said in a recent paper. It described user empowerment tools as “a false hope”, and rightly had a number of concerns about undue reliance on tools. It said:
“There is a real danger of users being overwhelmed and bewildered”.
It goes on to say that
“tools cannot do all the work, because so many other factors are in play—parental styles, media literacy and technological confidence, different levels of vulnerability and, crucially, trust”.
The real question—this is why I thought we should look at it from the other side of things in terms of default—is about how we mandate the use of these user empowerment tools in the Bill for both children and adults. In a sense, my concerns are exactly the opposite of those of the noble Baroness, Lady Fox—for some strange, unaccountable reason.
The noble Baroness, Lady Morgan, the noble Lord, Lord Griffiths, the right reverend Prelate and, notably, my noble friend Lady Parminter have made a brilliant case for their amendment, and it is notable that these amendments are supported by a massive range of organisations. They are all in this area of vulnerable adults: the Mental Health Foundation, Mind, the eating disorder charity Beat, the Royal College of Psychiatrists, the British Psychological Society, Rethink Mental Illness, Mental Health UK, and so on. It is not a coincidence that all these organisations are discussing this “feature”. This is a crucial aspect of the Bill.
Again, I was very much taken by some of the descriptions used by noble Lords during the debate. The right reverend Prelate the Bishop of Oxford said that young people do not suddenly become impervious to content when they reach 18, and he particularly described the pressures as the use of AI only increases. I thought the way the noble Baroness, Lady Harding, described the progression from teenagehood to adulthood was extremely important. There is not some sort of point where somebody suddenly reaches the age of 18 and attains a full adulthood which enables them to deal with all this content.
Under the Bill as it stands, adult users could still see and be served some of the most dangerous content online. As we have heard, this includes pro-suicide, pro-anorexia and pro-bulimia content. One has only to listen to what my noble friend Lady Parminter had to say to really be affected by the operation, if you like, of social media in those circumstances. This is all about the vulnerable. Of course, we know that anorexia has the highest mortality rate of any mental health problem; the NHS is struggling to provide specialist treatment to those who need it. Meanwhile, suicide and self-harm-related content remains common and is repeatedly implicated in deaths. All Members here who were members of the Joint Committee remember the evidence of Ian Russell about his daughter Molly. I think that affected us all hugely.
We believe now you can pay your money and take your choice of whichever amendment seems appropriate. Changing the user empowerment provisions to require category 1 providers to have either the safest options as default for users or the terms of my two amendments is surely a straightforward way of protecting the vast majority of internet users who do not want this material served to them.
You could argue that the new offence of encouragement to serious self-harm, which the Government have committed to introducing, might form part of the solution here, but you cannot criminalise all the legal content that treads the line between glorification and outright encouragement. Of course, we know the way the Bill has been changed. No similar power is proposed, for instance, to address eating disorder content.
The noble Baroness, Lady Healy, quoted our own Communications and Digital Committee and its recommendations about a comprehensive toolkit of settings overseen by Ofcom, allowing users to decide what types of content they see and from whom. I am very supportive of Amendment 38A from the noble Lord, Lord Knight, which gives a greater degree of granularity about the kind of user, in a sense, that can communicate to users.
Modesty means that of course I prefer my own amendments and I agree with the noble Baronesses, Lady Fraser, Lady Bull and Lady Harding, and I am very grateful for their support. But we are all heading in the same direction. We are all arguing for a broader “by default” approach. The onus should not be on these vulnerable adults in particular to switch them on, as the noble Baroness, Lady Bull, said. It is all about those vulnerable adults and we must, as my noble friend Lady Burt said, have their best interests at heart, and that is why we have tabled these amendments.
(1 year, 6 months ago)
Lords Chamber
That is right. What is interesting about that useful intervention from the noble Lord, Lord Bethell, is that that kind of gets search off the hook in respect of gambling. You are okay to follow the link from the search engine, but then you are age-gated at the point of the content. Clearly, with thumbnail images and so on in search, we need something better than that. The Bill requires something better than that already; should we go further? My question to the Minister is whether this could be similar to the discussion we had with the noble Baroness, Lady Harding, around non-mandatory codes and alternative methods. I thought that the Minister’s response in that case was quite helpful.
Could it be that if Part 3 and category 2A services chose to use age verification, they could be certain that they are compliant with their duties to protect children from pornographic and equivalent harmful content, but if they chose age-assurance techniques, it would then be on them to show Ofcom evidence of how that alternative method would still provide the equivalent protection? That would leave the flexibility of age assurance; it would not require age verification but would still set the same bar. I merely offer that in an attempt to be helpful to the Minister, in the spirit of where the Joint Committee and the noble Lord, Lord Clement-Jones, were coming from. I look forward to the Minister’s reply.
Before the noble Lord sits down, can I ask him whether his comments make it even more important that we have a clear and unambiguous definition of age assurance and age verification in the Bill?
I would not want to disagree with the noble Baroness for a moment.
My Lords, I understand that, for legislation to have any meaning, it has to have some teeth and you have to be able to enforce it; otherwise, it is a waste of time, especially with something as important as the legislation that we are discussing here.
I am a bit troubled by a number of the themes in these amendments and I therefore want to ask some questions. I saw that the Government had tabled these amendments on senior manager liability, then I read amendments from both the noble Lord, Lord Bethell, and the Labour Party, the Opposition. It seemed to me that even more people would be held liable and responsible as a result. I suppose I have a dread that—even with the supply chain amendment—this means that lots of people are going to be sacked. It seems to me that this might spiral dangerously out of control and everybody could get caught up in a kind of blame game.
I appreciate that I might not have understood, so this is a genuine attempt to do so. I am concerned that these new amendments will force senior managers and, indeed, officers and staff to take an extremely risk-averse approach to content moderation. They now have not only to cover their own backs but to avoid jail. One of my concerns has always been that this will lead to the over-removal of legal speech, and more censorship, so that is a question I would like to ask.
I also want to know how noble Lords think this will lie in relation to the UK being a science and technology superpower. Understandably, some people have argued that these amendments are making the UK a hostile environment for digital investment, and there is something to be balanced up there. Is there a risk that this will lead to the withdrawal of services from the UK? Will it make working for these companies unattractive to British staff? We have already heard that Jimmy Wales has vowed that the Wikimedia foundation will not scrutinise posts in the way demanded by the Bill. Is he going to be thrown in prison, or will Wikipedia pull out? How do we get the balance right?
What is the criminal offence that has a threat of a prison sentence? I might have misunderstood, but a technology company manager could fail to prevent a child or young person encountering legal but none the less allegedly harmful speech, be considered in breach of these amendments and get sent to prison. We have to be very careful that we understand what this harmful speech is, as we discussed previously. The threshold for harm, which encompasses physical and psychological harm, is vast and could mean people going to prison without the precise criminal offence being clear. We talked previously about VPNs. If a tech savvy 17-year-old uses a VPN and accesses some of this harmful material, will someone potentially be criminally liable for that young person getting around the law, find themselves accused of dereliction of duty and become a criminal?
My final question is on penalties. When I was looking at this Bill originally and heard about the eye-watering fines that some Silicon Valley companies might face, I thought, “That will destroy them”. Of course, to them it is the mere blink of an eye, and I do get that. This indicates to me, given the endless conversations we have had on whether size matters, that in this instance size does matter. The same kind of liabilities will be imposed not just on the big Silicon Valley monsters that can bear these fines, but on Mumsnet—or am I missing something? Mumsnet might not be the correct example, but could not smaller platforms face similar liabilities if a young person inadvertently encounters harmful material? It is not all malign people trying to do this; my unintended consequence argument is that I do not want to create criminals when a crime is not really being committed. It is a moral dilemma, and I do understand the issue of enforcement.
I rise very much to support the comments of my noble friend Lord Bethell and, like him, to thank the Minister for bringing forward the government amendments. I will try to address some of the comments the noble Baroness, Lady Fox, has just made.
One must view this as an exercise in working out how one drives culture change in some of the biggest and most powerful organisations in the world. Culture change is really hard. It is hard enough in a company of 10 people, let alone in a company with hundreds of thousands of employees across the world that has more money than a single country. That is what this Bill requires these enormous companies to do: to change the way they operate when they are looking at an inevitably congested, contested technology pipeline, by which I mean—to translate that out of tech speak—they have more work to do than even they can cope with. Every technology company, big or small, always has this problem: more good ideas than their technologists can cope with. They have to prioritise what to fix and what to implement. For the last 15 years, digital companies have prioritised things that drive income, but not the safety of our children. That requires a culture change from the top of the company.
(1 year, 6 months ago)
Lords Chamber
It is a great pleasure to follow my noble friend Lord Russell and to thank him for his good wishes. I assure the Committee that there is nowhere I would rather spend my birthday, in spite of some competitive offers. I remind noble Lords of my interests in the register, particularly as the chair of 5Rights Foundation.
As my noble friend has set out, these amendments fall in three places: the risk assessments, the safety duties and the codes of practice. However, together they work on the overarching theme of safety by design. I will restrict my detailed remarks to a number of amendments in the first two categories. This is perhaps a good moment to recall the initial work of Carnegie, which provided the conceptual approach of the Bill several years ago in arguing for a duty of care. The Bill has gone many rounds since then, but I think the principle remains that a regulated service should consider its impact on users before it causes them harm. Safety by design, to which all the amendments in this group refer, is an embodiment of a duty of care. In thinking about these amendments as a group, I remind the Committee that both the proportionality provisions and the fact that this is a systems and processes Bill means that no company can, should or will be penalised for a single piece of content, a single piece of design or, indeed, low-level infringements.
Amendments 24, 31, 77 and 84 would delete “content” from the Government’s description of what is harmful to children, meaning that the duty is to consider harm in the round rather than just harmful content. The definition of “content” is drawn broadly in Clause 207 as
“anything communicated by means of an internet service”,
but the examples in the Bill, including
“written material … music and data of any description”,
once again fail to include design features that are so often the key drivers of harm to children.
On day three of Committee, the Minister said:
“The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service … This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children”.—[Official Report, 27/4/23; col. 1385.]
However, in looking at the child safety duties, Clause 11(5) says:
“The duties … in subsections (2) and (3) apply across all areas of a service, including the way it is designed, operated and used”,
but subsection (14) says:
“The duties set out in subsections (3) and (6)”—
which are the duties to operate proportionate systems and processes to prevent and protect children from encountering harmful content and to include them in terms of service—
“are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.
I hesitate to say whether that is contradictory. I am not actually sure, but it is confusing. I am concerned that while we are reassured that “content” means content and activity and that the risk assessment considers functionality, “harm” is then repeatedly expressed only in the form of content.
Over the weekend, I had an email exchange with the renowned psychoanalyst and author, Norman Doidge, whose work on the plasticity of the brain profoundly changed how we think about addiction and compulsion. In the exchange, he said that
“children’s exposures to super doses, of supernormal images and scenes, leaves an imprint that can hijack development”.
Then, he said that
“the direction seems to be that AI would be working out the irresistible image or scenario, and target people with these images, as they target advertising”.
His argument is that it is not just the image but the dissemination and tailoring of that image that maximises the impact. The volume and frequency of those images create habits in children that take a lifetime to change—if they change at all. Amendments 32 and 85 would remove this language to ensure that content that is harmful by virtue of its dissemination is accounted for.
I turn now to Amendments 28 and 82, which cut the reference to the
“size and capacity of the provider of the service”
in deeming what measures are proportionate. We have already discussed that small is not safe. Platforms such as Yubo, Clapper and Discord have all been found to harm children and, as both the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, told us, small can become big very quickly. It is far easier to build to a set of rules than it is to retrofit them after the event. Again, I point out that Ofcom already has duties of proportionality; adding size and capacity is unnecessary and may tip the scale to creating loopholes for smaller services.
Amendment 138 seeks to reverse the exemption in Clause 54 of financial harms. More than half of the 100 top-grossing mobile phone apps contain loot boxes, which are well established as unfair and unhealthy, priming young children to gamble and leading to immediate hardship for parents landed with extraordinary bills.
By rights, Amendments 291 and 292 could fit in the future-proof set of amendments. The way that the Bill in Clause 204 separates out functionalities in terms of search and user-to-user is in direct opposition to the direction of travel in the tech sector. TikTok does shopping, Instagram does video, Amazon does search; autocomplete is an issue across the full gamut of services, and so on and so forth. This amendment simply combines the list of functionalities that must be risk-assessed and makes them apply on any regulated service. I cannot see a single argument against this amendment: it cannot be the Government’s intention that a child can be protected, on search services such as Google, from predictive search or autocomplete, but not on TikTok.
Finally, Amendment 295 will embed the understanding that most harm is cumulative. If the Bereaved Parents for Online Safety were in the Chamber, or any child caught up in self-harm, depression sites, gambling, gaming, bullying, fear of exposure, or the inexorable feeling of losing their childhood to an endless scroll, they would say at the top of their voices that it is not any individual piece of content, or any one moment or incident, but the way in which they are nudged, pushed, enticed and goaded into a toxic, harmful or dangerous place. Adding the simple words
“the volume of the content and the frequency with which the content is accessed”
to the interpretation of what can constitute harm in Clause 205 is one of the most important things that we can do in this Chamber. This Bill comes too late for a whole generation of parents and children but, if these safety by design amendments can protect the next generation of children, I will certainly be very glad.
My Lords, it is an honour, once again, to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, in this Committee. I am going to speak in detail to the amendments that seek to change the way the codes of practice are implemented. Before I do, however, I will very briefly add my voice to the general comments that the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, have just taken us through. Every parent in the country knows that both the benefits and the harms that online platforms can bring our children are not just about the content. It is about the functionality: the way these platforms work; the way they suck us in. They do give us joy but they also drive addiction. It is hugely important that this Bill reflects the functionality that online platforms bring, and not just content in the normal sense of the word “content”.
I will now speak in a bit more detail about the following amendments: Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A—I will finish soon, I promise—122, 122ZA, 122ZB and 122ZC.
I am afraid I may well have done.
That list shows your Lordships some of the challenges we all have with the Bill. All these amendments seek to ensure that the codes of practice relating to child safety are binding. Such codes should be principles-based and flexible to allow companies to take the most appropriate route of compliance, but implementing these codes should be mandatory, rather than, as the Bill currently sets out, platforms being allowed to use “alternative measures”. That is what all these amendments do—they do exactly the same thing. That was a clear and firm recommendation from the joint scrutiny committee. The Government’s response to that joint scrutiny committee report was really quite weak. Rather than rehearse the joint scrutiny committee’s views, I will rehearse the Government’s response and why it is not good enough to keep the Bill as it stands.
The first argument the Government make in their response to the joint scrutiny report is that there is no precedent for mandatory codes of conduct. But actually there are. There is clear precedent in child protection. In the physical world, the SEND code for how we protect some of our most vulnerable children is mandatory. Likewise, in the digital world, the age-appropriate design code, which we have mentioned many a time, is also mandatory. So there is plenty of precedent.
The second concern—this is quite funny—was that stakeholders were concerned about having multiple codes of conduct because it could be quite burdensome on them. Well, forgive me for not crying too much for these enormous tech companies relative to protecting our children. The burden I am worried about is the one on Ofcom. This is an enormous Bill, which places huge amounts of work on a regulator that already has a very wide scope. If you make codes of conduct non-mandatory, you are in fact making the work of the regulator even harder. The Government themselves in their response say that Ofcom has to determine what the minimum standards should be in these non-binding codes of practice. Surely it is much simpler and more straightforward to make these codes mandatory and, yes, to add potentially a small additional burden to these enormous tech companies to ensure that we protect our children.
The third challenge is that non-statutory guidance already looks as if it is causing problems in this space. On the video-sharing platform regime, which is non-mandatory, Ofcom has already said that in its first year of operation it has
“seen a large variation in platforms’ readiness to engage with Ofcom”.
All that will simply make it harder and harder, so the burden will lie on this regulator—which I think all of us in this House are already worried is being asked to do an awful lot—if we do not make it very clear what is mandatory and what is not. The Secretary of State said of the Bill that she is
“determined to put these vital protections for … children … into law as quickly as possible”.
A law that puts in place a non-mandatory code of conduct is not what parents across the country would expect from that statement from the Secretary of State. People out there—parents and grandparents across the land—would expect Ofcom to be setting some rules and companies to be required to follow them. That is exactly what we do in the physical world, and I do not understand why we would not want to do it in the digital world.
Finally—I apologise for having gone on for quite a long time—I will very briefly talk specifically to Amendment 32A, in the name of the noble Lord, Lord Knight, which is also in this group. It is a probing amendment which looks at how the Bill will address and require Ofcom and participants to take due regard of VPNs: the ability for our savvy children—I am the mother of two teenage girls—to get round all this by using a VPN to access the content they want. This is an important amendment and I am keen to hear what my noble friend the Minister will say in response. Last week, I spoke about my attempts to find out how easy it would be for my 17-year-old daughter to access pornography on her iPhone. I spoke about how I searched in the App Store on her phone and found that immediately a whole series of 17-plus-rated apps came up that were pornography sites. What I did not mention then is that with that—in fact, at the top of the list—came a whole series of VPN apps. Just in case my daughter was naive enough to think that she could just click through and watch it, and Apple was right that 17 year-olds were allowed to watch pornography, which obviously they are not, the App Store was also offering her an easy route to access it through a VPN. That is not about content but functionality, and we need to properly understand why this bundle of amendments is so important.
Well, I must regard myself as doubly rebuked, and unfairly, because my reflections are very relevant to the amendments, and I have developed them in that direction. In respect of the parents, they have suffered very cruelly and wrongly, but although it may sound harsh, as I have said in this House before on other matters, hard cases make bad law. We are in the business of trying to make good law that applies to the whole population, so I do not think that these are wholly—
If my noble friend could, would he roll back the health and safety regulations for selling toys, in the same way that he seems so happy to have no health and safety regulations for children’s access to digital toys?
My Lords, if the internet were a toy, aimed at children and used only by children, those remarks would of course be very relevant, but we are dealing with something of huge value and importance to adults as well. It is the lack of consideration of the role of adults, the access for adults and the effects on freedom of expression and freedom of speech, implicit in these amendments, that causes me so much concern.
I seem to have upset everybody. I will now take issue with and upset the noble Baroness, Lady Benjamin, with whom I have not engaged on this topic so far. At Second Reading and earlier in Committee, she used the phrase, “childhood lasts a lifetime”. There are many people for whom this is a very chilling phrase. We have an amendment in this group—a probing amendment, granted—tabled by the noble Lord, Lord Knight of Weymouth, which seeks to block access to VPNs as well. We are in danger of putting ourselves in the same position as China, with a hermetically sealed national internet, attempting to put borders around it so that nobody can breach it. I am assured that even in China this does not work and that clever and savvy people simply get around the barriers that the state has erected for them.
Before I sit down, I will redeem myself a little, if I can, by giving some encouragement to the noble Baroness, Lady Kidron, on Amendments 28 and 32—although I think the amendments are in the name of the noble Lord, Lord Russell of Liverpool. If we are to assess the danger posed by the internet to children, these amendments seek to substitute an assessment of the riskiness of the provider for the Government’s emphasis on the size of the provider. As I said earlier in Committee, I do not regard size as being a source of danger. When it comes to many other services—I mentioned that I buy my sandwich from Marks & Spencer as opposed to a corner shop—it is very often the bigger provider I feel is going to be safer, because I feel I can rely on its processes more. So I would certainly like to hear how my noble friend the Minister responds on that point in relation to Amendments 28 and 32, and why the Government continue to put such emphasis on size.
More broadly, in these understandable attempts to protect children, we are in danger of using language that is far too loose and of having an effect on adult access to the internet which is not being considered in the debate—or at least has not been until I have, however unwelcomely, raised it.
This is the trouble with looking at legislation that is technologically neutral and future-proofed and has to envisage risks and solutions changing in years to come. We want to impose duties that can technically be met, of course, but this is primarily a point for companies in the sector. We are happy to engage and provide further information, but it is inherently part of the challenge of identifying evolving risks.
The provision in Clause 11(16) addresses the noble Lord’s concerns about the use of VPNs in circumventing age-assurance or age-verification measures. For it to apply, providers would need to ensure that the measures they put in place are effective and that children cannot normally access their services. They would need to consider things such as how the use of VPNs affects the efficacy of age-assurance and age-verification measures. If children were routinely using VPNs to access their service, they would not be able to conclude that Clause 11(16) applies. I hope that sets out how this is covered in the Bill.
Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A, 122, 122ZA, 122ZB and 122ZC from the noble Lord, Lord Russell of Liverpool, seek to make the measures Ofcom sets out in codes of practice mandatory for all services. I should make it clear at the outset that companies must comply with the duties in the Bill. They are not optional and it is not a non-statutory regime; the duties are robust and binding. It is important that the binding legal duties on companies are decided by Parliament and set out in legislation, rather than delegated to a regulator.
Codes of practice provide clarity on how to comply with statutory duties, but should not supersede or replace them. This is true of codes in other areas, including the age-appropriate design code, which is not directly enforceable. Following up on the point from my noble friend Lady Harding of Winscombe, neither the age-appropriate design code nor the SEND code is directly enforceable. The Information Commissioner’s Office or bodies listed in the Children and Families Act must take the respective codes into account when considering whether a service has complied with its obligations as set out in law.
As with these codes, what will be directly enforceable in this Bill are the statutory duties by which all sites in scope of the legislation will need to abide. We have made it clear in the Bill that compliance with the codes will be taken as compliance with the duties. This will help small companies in particular. We must also recognise the diversity and innovative nature of this sector. Requiring compliance with prescriptive steps rather than outcomes may mean that companies do not use the most effective or efficient methods to protect children.
I reassure noble Lords that, if companies decide to take a different route to compliance, they will be required to document what their own measures are and how they amount to compliance. This will ensure that Ofcom has oversight of how companies comply with their duties. If the alternative steps that providers have taken are insufficient, they could face enforcement action. We expect Ofcom to take a particularly robust approach to companies which fail to protect their child users.
My noble friend Lord Vaizey touched on the age-appropriate design code in his remarks—
My noble friend the Minister did not address the concern I set out that the Bill’s approach will overburden Ofcom. If Ofcom has to review the suitability of each set of alternative measures, we will create an even bigger monster than we first thought.
I do not think that it will. We have provided further resource for Ofcom to take on the work that this Bill will give it; it has been very happy to engage with noble Lords to talk through how it intends to go about that work and, I am sure, would be happy to follow up on that point with my noble friend to offer her some reassurance.
Responding to the point from my noble friend Lord Vaizey, the Bill is part of the UK’s overall digital regulatory landscape, which will deliver protections for children alongside the data protection requirements for children set out in the Information Commissioner’s age-appropriate design code. Ofcom has strong existing relationships with other bodies in the regulatory sphere, including through the Digital Regulation Co-operation Forum. The Information Commissioner has been added to this Bill as a statutory consultee for Ofcom’s draft codes of practice and relevant pieces of guidance formally to provide for the ICO’s input into its areas of expertise, especially relating to privacy.
Amendment 138 from the noble Lord, Lord Russell of Liverpool, would amend the criteria for non-designated content which is harmful to children to bring into scope content whose risk of harm derives from its potential financial impact. The Bill already requires platforms to take measures to protect all users, including children, from financial crime online. All companies in scope of the Bill will need to design and operate their services to reduce the risk of users encountering content amounting to a fraud offence, as set out in the list of priority offences in Schedule 7. This amendment would expand the scope of the Bill to include broader commercial harms. These are dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This amendment therefore risks creating regulatory overlap, which would cause confusion for business while not providing additional protections to consumers and internet users.
Amendment 261 in the name of the right reverend Prelate the Bishop of Oxford seeks to modify the existing requirements for the Secretary of State’s review into the effectiveness of the regulatory framework. The purpose of the amendment is to ensure that all aspects of a regulated service are taken into account when considering the risk of harm to users and not just content.
As we have discussed already, the Bill defines “content” very broadly and companies must look at every aspect of how their service facilitates harm associated with the spread of content. Furthermore, the review clause makes explicit reference to the systems and processes which regulated services use, so the review can already cover harm associated with, for example, the design of services.
My Lords, I rise on this group of amendments, particularly with reference to Amendments 25, 78, 187 and 196, to inject a slight note of caution—I hope in a constructive manner—and to suggest that it would be the wrong step to try to incorporate them into this legislation. I say at the outset that I think the intention behind these amendments is perfectly correct; I do not query the intention of the noble Lord, Lord Russell, and others. Indeed, one thing that has struck me as we have discussed the Bill is the commonality of approach across the Chamber. There is a strong common desire to provide a level of protection for children’s rights, but I question whether these amendments are the right vehicle by which to do that.
It is undoubtedly the case that the spirit of the UNCRC is very strongly reflected within the Bill, and I think it moves in a complementary fashion to the Bill. Therefore, again, I do not query the UNCRC in particular. It can act as a very strong guide to government as to the route it needs to take, and I think it has had a level of influence on the Bill. I speak not simply as someone observing the Bill but as someone who, in a previous existence, served as an Education Minister in Northern Ireland and had direct responsibility for children’s rights. The guidance we received from the UNCRC was, at times, very useful to Ministers, so I do not question any of that.
For three reasons, I express a level of concern about these amendments. I mentioned that the purpose of the UNCRC is to act as a guide—a yardstick—for government as to what should be there in terms of domestic protections. That is its intention. The UNCRC itself was never written as a piece of legislation, and I do not think it was the original intention to have it directly incorporated and implemented as part of law. The UNCRC is aspirational in nature, which is very worthwhile. However, it is not written in a legislative form. At times, it can be a little vague, particularly if we are looking at the roles that companies will play. At times, it sets out very important principles, but ones which, if left for interpretation by the companies themselves, could create a level of tension.
To give an example, there is within the UNCRC a right to information and a right to privacy. That can sometimes create a tension for companies. The purpose of the UNCRC is to provide that level of guidance to government, to ensure that it gets it right, rather than to have the UNCRC grafted directly on to domestic law.
Secondly, the effect of these amendments would be to shift the interpretation and implementation of what is required of companies from government to the companies themselves. They would be left to try to determine this, whereas I think that the UNCRC is principally a device that tries to make government accountable for children’s rights. As such, it is appropriate that government has the level of responsibility to draft the regulations, in conjunction with key experts within the field, and to try to ensure that what we have in these regulations is fit for purpose and bespoke to the kind of regulations that we want to see.
To give a very good example, there are different commissioners across the United Kingdom. One of the key groups that the Government should clearly be consulting with to make sure they get it right is the Children’s Commissioners of the different jurisdictions in the United Kingdom. Through that process, but with that level of ownership still lying with government and Ofcom, we can create regulations that provide the level of protection for our children that we all desire to see; whereas, if the onus is effectively shifted on to companies simply to comply with what is a slightly vague, aspirational purpose in these regulations, that is going to lead to difficulties as regards interpretation and application.
Thirdly, there is a reference to having due regard to what is in the UNCRC. From my experience within government, and from seeing the way in which government departments apply that duty—and I appreciate that “due regard” has case law behind it—different government departments have tended to interpret it differently in different pieces of legislation. At one extreme, on some occasions it has effectively meant that lip service has been paid by government departments and, in effect, it has been largely ignored. Others have seen it as a very rigorous duty. If we see that level of disparity between government departments within the same Government, and if this is to be interpreted as a direct instruction to and requirement of companies of varying sizes—and perhaps with various attitudes and feelings of responsibility on this subject—that creates a level of difficulty in and of itself.
My final concern in relation to this has been mentioned in a number of debates on various groups of amendments. Where a lot of Peers would see either a weakness in the legislation or something else that needs to be improved, we need to have as much consistency and clarity as possible in both interpretation and implementation. As such, the more we move away from direct regulations, which could then be put in place, to relying on the companies themselves interpreting and implementing, perhaps in different fashions, with many being challenged by the courts at times, the more we create a level of uncertainty and confusion, both for the companies themselves and for users, particularly the children we are looking to protect.
While I have a lot of sympathy for the intention of the noble Lord, Lord Russell, and while we need to find a way to incorporate into the Bill in some form how we can drive children’s rights more centrally within this, the formulation of the direct grafting of the UNCRC on to this legislation, even through due regard, is the wrong vehicle for doing it. It is inappropriate. As such, it is important that we take time to try to find a better vehicle for the sort of intention that the noble Lord, Lord Russell, and others are putting forward. Therefore, I urge the noble Lord not to press his amendments. If he does, I believe that the Committee should oppose the amendments as drafted. Let us see if, collectively, we can find a better and more appropriate way to achieve what we all desire: to try to provide the maximum protection in a very changing world for our children as regards online safety.
My Lords, I support these amendments. We are in the process of having a very important debate, both in the previous group and in this one. I came to this really important subject of online safety 13 years ago, because I was the chief executive of a telecoms company. Just to remind noble Lords, 13 years ago neither Snap, TikTok nor Instagram—the three biggest platforms that children use today—existed, and telecoms companies were viewed as the bad guys in this space. I arrived, new to the telecoms sector, facing huge pressure—along with all of us running telecoms companies—from Governments to block content.
I often felt that the debate 13 years ago too quickly turned into what was bad about the internet. I was spending the vast majority of my working day trying to encourage families to buy broadband and to access this thing that you could see was creating huge value in people’s lives, both personal and professional. Sitting on these Benches, I fundamentally want to see a society with the minimum amount of regulation, so I was concerned that regulating internet safety would constrain innovation; I wanted to believe that self-regulation would work. In fact, I spent many hours in workshops with the noble Baroness, Lady Kidron, and many others in this Chamber, as we tried to persuade and encourage the tech giants—as everyone started to see that it was not the telecoms companies that were the issue; it was the emerging platforms—to self-regulate. It is absolutely clear that that has failed. I say that with quite a heavy heart; it has genuinely failed, and that is why the Bill is so important: to enshrine in law some hard regulatory requirements to protect children.
That does not change the underlying concern that I and many others—and everyone in this Chamber—have, that the internet is also potentially a force for good. All technology is morally neutral: it is the human beings who make it good or bad. We want our children to genuinely have access to the digital world, so in a Bill that is enshrining hard gates for children, it is really important that it is also really clear about the rights that children have to access that technology. When you are put under enormous pressure, it is too easy—I say this as someone who faced it 13 years ago, and I was not even facing legislation—to try to do what you think your Government want to do, and then end up causing harm to the individuals you are actually trying to protect. We need this counterbalance in this Bill. It is a shame that my noble friend Lord Moylan is not in his place, because, for the first time in this Committee, I find myself agreeing with him. It is hugely important that we remember that this is also about freedom and giving children the freedom to access this amazing technology.
Some parts of the Bill are genuinely ground-breaking, where we in this country are trying to work out how to put the legal scaffolding in place to regulate the internet. Documenting children’s rights is not something where we need to start from scratch. That is why I put my name to this amendment: I think we should take a leaf from the UN Convention on the Rights of the Child. I recognise that the noble Lord, Lord Weir of Ballyholme, made some very thought-provoking comments about how we have to be careful about the ambiguity that we might be creating for companies, but I am afraid that ambiguity is there whether we like it or not. These are not just decisions for government: the tension between offering services that will brighten the lives of children and risking harm to them is exactly what lies behind the decisions that technology companies take every day. As the Bill enshrines some obligations on them to protect children from the harms, I firmly believe it should also enshrine obligations on them to offer the beauty and the wonder of the internet, and in doing that enshrine their right to this technology.
I want to challenge the noble Baroness’s assertion that the Bill is not about children’s rights. Anyone who has a teenage child knows that their right to access the internet is keenly held and fought out in every household in the country.
The quip works, but political rights are not quips. Political rights have responsibilities, and so on. If we gave children rights, they would not be dependent on adults and adult society. Therefore, it is a debate; it is a row about what our rights are. Guess what. It is a philosophical row that has been going on all around the world. I am just suggesting that this is not the place—
(1 year, 7 months ago)
Lords Chamber

My Lords, I agree in part with the noble Lord, Lord Moylan. I was the person who said that small was not safe, and I still feel that. I certainly do not think that anything in the Bill will make the world online 100% safe, and I think that very few noble Lords do, so it is important to say that. When we talk about creating a high bar or having zero tolerance, we are talking about ensuring that there is a ladder within the Bill so that the most extreme cases have the greatest force of law trying to attack them. I agree with the noble Lord on that.
I also absolutely agree with the noble Lord about implementation: if it is too complex and difficult, it will be unused and exploited in certain ways, and it will have a bad reputation. The only part of his amendment that I do not agree with is that we should look at size. Through the process of Committee, if we can look at risk rather than size, we will get somewhere. I share his impatience—or his inquiry—about what categories 2A and 2B mean. If category 2A means the most risky and category 2B means those that are less risky, I am with him all the way. We need to look into the definition of what they mean.
Finally, I mentioned several times on Tuesday that we need to look carefully at Ofcom’s risk profiles. Is this the answer to dealing with where risk gets determined, rather than size?
My Lords, I rise to speak along similar lines to the noble Baroness, Lady Kidron. I will address my noble friend Lord Moylan’s comments. I share his concern that we must not make the perfect the enemy of the good but, like the noble Baroness, I do not think that size is the key issue here, because of how tech businesses grow. Tech businesses are rather like building a skyscraper: if you get the foundations wrong, it is almost impossible to change how safe the building is as it goes up and up. As I said earlier this week, small tech businesses can become big very quickly, and, if you design your small tech business with the risks to children in mind at the very beginning, there is a much greater chance that your skyscraper will not wobble as it gets taller. On the other hand, if your small business begins by not taking children into account at all, it is almost impossible to address the problem once it is huge. I fear that this is the problem we face with today’s social media companies.
The noble Baroness, Lady Kidron, hit the nail on the head, as she so often does, in saying that we need to think about risk, rather than size, as the means of differentiating the proportionate response. In Clause 23, which my noble friend seeks to amend, the important phrase is “use proportionate measures” in subsection (2). Provided that we start with a risk assessment and companies are then under the obligation to make proportionate adjustments, that is how you build safe technology companies—it is just like how you build safe buildings.
My Lords, I will build on my noble friend’s comments. We have what I call the Andrew Tate problem. That famous pornographer and disreputable character started a business in a shed in Romania with a dozen employees. By most people’s assessment, it would have been considered a small business but, through content featuring pornography and the physical assault of women, he extremely quickly built something that served an estimated 3 billion pages, and it has had a huge impact on the children of the English-speaking world. A small business became a big, nasty business very quickly. That anecdote reinforces the point that small does not mean safe, and, although I agree with many of my noble friend’s points, the lens of size is perhaps not the right one to look through.
My Lords, I am grateful to all noble Lords who have spoken in this debate. I hope that the noble Baroness, Lady Deech, and the noble Lord, Lord Weir of Ballyholme, will forgive me if I do not comment on the amendment they spoke to in the name of my noble friend Lord Pickles, except to say that of course they made their case very well.
I will briefly comment on the remarks of the noble Baroness, Lady Kidron. I am glad to see a degree of common ground among us in terms of definitions and so forth—a small piece of common ground that we could perhaps expand in the course of the many days we are going to be locked up together in your Lordships’ House.
I am grateful too to the noble Lord, Lord Allan of Hallam. I am less clear on “2B or not 2B”, if that is the correct way of referring to this conundrum, than I was before. The noble Baroness, Lady Kidron, said that size does not matter and that it is all about risk, but my noble friend the Minister cunningly conflated the two and said at various points “the largest” and “the riskiest”. I do not see why the largest are necessarily the riskiest. On the whole, if I go to Marks & Spencer as opposed to going to a corner shop, I might expect rather less risk. I do not see why the two run together.
I address the question of size in my amendment because that is what the Bill focuses on. I gather that the noble Baroness, Lady Kidron, may want to explore at some stage in Committee why that is the case and whether a risk threshold might be better than a size threshold. If she does that, I will be very interested in following and maybe even contributing to that debate. However, at the moment, I do not think that any of us is terribly satisfied with conflating the two—that is the least satisfactory way of explaining and justifying the structure of the Bill.
On the remarks of my noble friend Lady Harding of Winscombe, I do not want in the slightest to sound as if there is any significant disagreement between us—but there is. She suggested that I was opening the way to businesses building business models “not taking children into account at all”. My amendment is much more modest than that. There are two ways of dealing with harm in any aspect of life. One is to wait for it to arrive and then to address it as it arises; the other is constantly to look out for it in advance and to try to prevent it arising. The amendment would leave fully in place the obligation to remove harm, which is priority illegal content or other illegal content, that the provider knows about, having been alerted to it by another person or become aware of it in any other way. That duty would remain. The duty that is removed, especially from small businesses—and really this is quite important—is the obligation constantly to be looking out for harm, because it involves a very large, and I suggest possibly ruinous, commitment to constant monitoring of what appears on a search engine. That is potentially prohibitive, and it arises in other contexts in the Bill as well.
There should be occasions when we can say that knowing that harmful stuff will be removed as soon as it appears, or very quickly afterwards, is adequate for our purposes, without requiring firms to go through a constant monitoring or risk-assessment process. The risk assessment would have to be adjudicated by Ofcom, I gather. Even if no risk was found, of course, that would not be the end of the matter, because I am sure that Ofcom would, very sensibly, require an annual renewal of that application, or after a certain period, to make sure that things had not changed. So even to escape the burden is quite a large burden for small businesses, and then to implement the burden is so onerous that it could be ruinous, whereas taking stuff down when it appears is much easier to do.
Perhaps I might briefly come in. My noble friend Lord Moylan may have helped explain why we disagree: our definition of harm is very different. I am most concerned that we address the cumulative harms that online services, both user-to-user services and search, are capable of inflicting. That requires us to focus on the design of the service, which we need to do at the beginning, rather than the simpler harm that my noble friend is addressing, which is specific harmful content—not in the sense in which “content” is used in the Bill but “content” as in common parlance; that is, a piece of visual or audio content. My noble friend makes the valid point that that is the simplest way to focus on removing specific pieces of video or text; I am more concerned that we should not exclude small businesses from designing and developing their services such that they do not consider the broader set of harms that are possible and that add up to the cumulative harm that we see our children suffering from today.
So I think our reason for disagreement is that we are focusing on a different harm, rather than that we violently disagree. I agree with my noble friend that I do not want complex bureaucratic processes imposed on small businesses; they need to design their services when they are small, which makes it simpler and easier for them to monitor harm as they grow, rather than waiting until they have grown. That is because the backwards re-engineering of a technology stack is nigh-on impossible.
My noble friend makes a very interesting point, and there is much to ponder in it—too much to ponder for me to respond to it immediately. Since I am confident that the issue is going to arise again during our sitting in Committee, I shall allow myself the time to reflect on it and come back later.
While I understand my noble friend’s concern about children, the clause that I propose to remove is not specific to children; it relates to individuals, so it covers adults as well. I think I understand what my noble friend is trying to achieve—I shall reflect on it—but this Bill and the clauses we are discussing are a very blunt way of going at it and probably need more refinement even than the amendments we have seen tabled so far. But that is for her to consider.
I think this debate has been very valuable. I did not mention it, but I am grateful also for the contribution from the noble Baroness, Lady Merron. I beg leave to withdraw the amendment.
(1 year, 7 months ago)
Lords Chamber
My Lords, I support this group of amendments, so ably introduced by my noble friend and other noble Lords this afternoon.
I am not a lawyer and I would not say that I am particularly experienced in this business of legislating. I found this issue incredibly confusing. I hugely appreciate the briefings and discussions—I feel very privileged to have been included in them—with my noble friend the Minister, officials and the Secretary of State herself in their attempt to explain to a group of us why these amendments are not necessary. I was so determined to try to understand this properly that, yesterday, when I was due to travel to Surrey, I took all my papers with me. I got on the train at Waterloo and started to work my way through the main challenges that officials had presented.
The first challenge was that, fundamentally, these amendments cut across the Bill’s definitions of “primary priority content” and “priority content”. I tried to find them in the Bill. Unfortunately, in Clause 54, there is a definition of primary priority content. It says that, basically, primary priority content is what the Secretary of State says it is, and that content that is harmful to children is primary priority content. So I was none the wiser on Clause 54.
One of the further challenges that officials have given us is that apparently we, as a group of noble Lords, were confusing the difference between harm and risk. I then turned to Clause 205, which comes out with the priceless statement that a risk of harm should be read as a reference to harm—so maybe they are the same thing. I am still none the wiser.
Yesterday morning, I found myself playing what I can only describe as a parliamentary game of Mornington Crescent, as I went round and round in circles. Unfortunately, it was such a confusing game of Mornington Crescent that I forgot that I needed to change trains, ended up in Richmond instead of Redhill, and missed my meeting entirely. I am telling the Committee this story because, as the debate has shown, it is so important that we put in the Bill a definition of the harms that we are intending to legislate for.
I want to address the points made by the noble Baroness, Lady Fox. She said that we might not all agree on what harms are genuinely harmful for children. That is precisely why Parliament needs to decide this, rather than abdicate it to a regulator who, as other noble Lords said earlier today, is then put into a political space. It is the job of Parliament to decide what is dangerous for our children and what is not. That is the approach that we take in the physical world, and it should be the approach that we take in the online world. We should do that in broad categories, which is why the four Cs is such a powerful framework. I know that we are all attempting to predict the known unknowns, which is impossible, but this framework, which gives categories of harm, is clear that it can be updated, developed and, as my noble friend Lord Bethell, said, properly consulted on. We as parliamentarians should decide; that is the purpose of voting in Parliament.
I have a couple of questions for my noble friend the Minister. Does he agree that Parliament needs to decide what the categories of online harms are that the Bill is attempting to protect our children from? If he does, why is it not the four Cs? If he really thinks it is not the four Cs, will he bring back an alternative schedule of harms?
My Lords, I will echo the sentiments of the noble Baroness, Lady Harding, in my contribution to another very useful debate, which has brought to mind the good debate that we had on the first day in Committee, in response to the amendment tabled by the noble Lord, Lord Stevenson, in which we were seeking to get into the Bill what we are actually trying to do.
I thought that the noble Baroness, Lady Fox, was also welcoming additional clarity, specifically in the area of psychological harm, which I agree with. Certainly in its earlier incarnations, the Bill was scattered throughout with references, some of which have been removed, but they are very much open to interpretation. I hope that we will come back to that.
I was struck by the point made by the noble Lord, Lord Russell, around what took place in that coroner’s hearing. You had two different platforms with different interpretations of what they thought that their duty of care would be. That is very much the point. In my experience, platforms will follow what they are told to follow. The challenge is when each of them comes to their own individual view around what are often complex areas. There we saw platforms presenting different views about their risk assessments. If we clarify that for them through amendments such as these, we are doing everyone a favour.
I again compliment my noble friend Lady Benjamin for her work in this area. Her speech was also a model of clarity. If we can bring some of that clarity to the legislation and to explaining what we want, that will be an enormous service.
The noble Lord, Lord Knight, made some interesting points around how this would add value to the Bill, teasing out some of the specific gaps that we have there. I look forward to hearing the response on that.
I was interested in the comments from the noble Lord, Lord Bethell, on mobile phone penetration. We should all hold in common that we are not going back to a time BC—before connection. Our children will be connected, which creates the imperative for us to get this right. There has perhaps been a tendency for us to bury our heads in the sand, and occasionally you hear that still—it is almost as if we would wish this world away. However, the noble Baroness, Lady Kidron, is at the other end of the spectrum; she has come alive on this subject, precisely because she recognises that that will not happen. We are in a world where our children will be connected, so it is on us to figure out how we want those connections to work and to instruct the people who provide those connective services on what they should do. It is certainly not for us to imagine that somehow they will all go away. We will come to that in later groups when we talk about minimum ages; if younger children are online, there is a real issue around how we are going to deal with that.
The right reverend Prelate the Bishop of Oxford highlighted some really important challenges based on real experiences that families today are suffering—let us use the word as it should be—and made the case for clarity. I do not know how much we are allowed to talk in praise of EU legislation, but I am looking at the Digital Services Act—I have looked at a lot of EU legislation—and this Bill, and there is a certain clarity to EU regulation, particularly the process of adding recitals, which are attached to the law and explain what it is meant to do. That is sometimes missing here. I know that there are different legal traditions, but you can sometimes look at an EU regulation and the UK law and the former appears to be much clearer in its intent.
That brings me to the substance of my comments in response to this group, so ably introduced by the noble Baroness, Lady Kidron. I hope that the Government heed and recognise that, at present, no ordinary person can know what is happening in the Bill—other than, perhaps, the wife of the noble Lord, Lord Stevenson, who will read it for fun—and what we intend to do.
I was thinking back to the “2B or not 2B” debate we had earlier about the lack of clarity around something even as simple as the classification of services. I was also thinking that, if you ask what the Online Safety Bill does to restrict self-harm content, the answer would be this: if it is a small social media platform, it will probably be categorised as a 2B service, then we can look at Schedule 7, where it is prohibited from assisting suicide, but we might want to come back to some of the earlier clauses with the specific duties—and it will go on and on. As the noble Baroness, Lady Harding, described, you are leaping backwards and forwards in the Bill to try to understand what we are trying to do with the legislation. I think that is a genuine problem.
In effect, the Bill is Parliament setting out the terms of service for how we want Ofcom to regulate online services. We debated terms of service earlier. What is sauce for the goose is sauce for the gander. We are currently failing our own tests of simplicity and clarity on the terms of service that we will give to Ofcom.
As well as platforms, if ordinary people want to find out what is happening, then, just like those platforms with the terms of service, we are going to make them read hundreds of pages before they find out what this legislation is intended to do. We can and should make this simpler for children and parents. I was able to meet Ian Russell briefly at the end of our Second Reading debate. He has been an incredibly powerful and pragmatic voice on this. He is asking for reasonable things. I would love to be able to give a Bill to Ian Russell, and the other families that the right reverend Prelate the Bishop of Oxford referred to, that they can read and that tells them very clearly how Parliament has responded to their concerns. I think we are a long way short of that simple clarity today.
It would be extraordinarily important for service providers, as I already mentioned in response to the noble Lord, Lord Russell. They need that clarity, and we want to make sure that they have no reason to say, “I did not understand what I was being asked to do”. That should be from the biggest to the smallest, as the noble Lord, Lord Moylan, keeps rightly raising with us. Any small service provider should be able to very clearly and simply understand what we are intending to do, and putting more text into the Bill that does that would actually improve it. This is not about adding a whole load of new complications and the bells and whistles we have described but about providing clarity on our intention. Small service providers would benefit from that clarity.
The noble Baroness, Lady Ritchie, rightly raised the issue of the speed of the development of technology. Again, we do not want the small service provider in particular to think it has to go back and do a whole new legal review every time the technology changes. If we have a clear set of principles, it is much quicker and simpler for it to say, “I have developed a new feature. How does it match up against this list?”, rather than having to go to Clause 12, Clause 86, Clause 94 and backwards and forwards within the Bill.
It will be extraordinarily helpful for enforcement bodies such as Ofcom to have a yardstick—again, this takes us back to our debate on the first day—for its prioritisation, because it will have to prioritise. It will not be able to do everything, everywhere, all at once. If we put that prioritisation into the legislation, it will, frankly, save potential arguments between Parliament, the Government and Ofcom later on, when they have decided to prioritise X and we wanted them to prioritise Y. Let us all get aligned on what we are asking them to do up front.
Dare I say—the noble Baroness, Lady Harding, reminded me of this—that it may also be extraordinarily helpful for us as politicians so that we can understand the state of the law. I mean not just the people who are existing specialists or are becoming specialists in this area and taking part in this debate but the other hundreds of Members of both Houses, because this is interesting to everyone. I have experience of being in the other place, and every Member of the other place will have constituents coming to them, often with very tragic circumstances, and asking what Parliament has done. Again, if they have the Online Safety Bill as currently drafted, I think it is hard for any Member of Parliament to be able to say clearly, “This is what we have done”. With those words and that encouraging wind, I hope the Government are able to explain, if not in this way, that they have a commitment to ensuring that we have that clarity for everybody involved in this process.
I am grateful to all noble Lords who have spoken on this group and for the clarity with which the noble Lord, Lord Stevenson, has concluded his remarks.
Amendments 20, 74, 93 and 123, tabled by the noble Baroness, Lady Kidron, would mean a significant revising of the Bill’s approach to content that is harmful to children. It would set a new schedule of harmful content and risk to children—the 4 Cs—on the face of the Bill and revise the criteria for user-to-user and search services carrying out child safety risk assessments.
I start by thanking the noble Baroness publicly—I have done so privately in our discussions—for her extensive engagement with the Government on these issues over recent weeks, along with my noble friends Lord Bethell and Lady Harding of Winscombe. I apologise that it has involved the noble Baroness, Lady Harding, missing her stop on the train. A previous discussion we had also very nearly delayed her mounting a horse, so I can tell your Lordships how she has devoted hours to this—as they all have over recent weeks. I would like to acknowledge their campaigning and the work of all organisations that the noble Baroness, Lady Kidron, listed at the start of her speech, as well as the families of people such as Olly Stephens and the many others that the right reverend Prelate the Bishop of Oxford mentioned.
I also reassure your Lordships that, in developing this legislation, the Government carried out extensive research and engagement with a wide range of interested parties. That included reviewing international best practice. We want this to be world-leading legislation, including the four Cs framework on the online risks of harm to children. The Government share the objectives that all noble Lords have echoed in making sure that children are protected from harm online. I was grateful to the noble Baroness, Lady Benjamin, for echoing the remarks I made earlier in Committee on this. I am glad we are on the same page, even if we are still looking at points of detail, as we should be.
As the noble Baroness, Lady Kidron, knows, it is the Government’s considered opinion that the Bill’s provisions already deliver these objectives. I know that she remains to be convinced, but I am grateful to her for our continuing discussions on that point, and for continuing to kick the tyres on this to make sure that this is indeed legislation of which we can be proud.
It is also clear that there is broad agreement across the House that the Bill should tackle content harmful to children, such as content that promotes eating disorders, illegal behaviour such as grooming and risk factors for harm such as the method by which content is disseminated, and the frequency of alerts. I am pleased to be able to put on record that the Bill as drafted already does this in the Government’s opinion, and reflects the principles of the four Cs framework, covering each of those: content, conduct, contact and commercial or contract risks to children.
First, it is important to understand how the Bill defines content, because that question of definition has been a confusing factor in some of the discussions hitherto. When we talk in general terms about content, we mean the substance of a message. This has been the source of some confusion. The Bill defines “content”, for the purposes of this legislation, in Clause 207 extremely broadly as
“anything communicated by means of an internet service”.
Under this definition, in essence, all user communication and activity, including recommendations by an algorithm, interactions in the metaverse, live streams, and so on, is facilitated by “content”. So, for example, unwanted and inappropriate contact from an adult to a child would be treated by the Bill as content harm. The distinctions that the four Cs make between content, conduct and contact risks are therefore not necessary. For the purposes of the Bill, they are all content risks.
Secondly, I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill.
Where are the commercial harms? I cannot totally get my head around my noble friend’s definition of content. I can sort of understand how it extends to conduct and contact, but it does not sound as though it could extend to the algorithm itself that is driving the addictive behaviour that most of us are most worried about.
In that vein, will the noble Lord clarify whether that definition of content does not include paid-for content?
(1 year, 7 months ago)
Lords Chamber
My Lords, I add my support for all the amendments in this group. I thank the noble Baroness, Lady Ritchie, for bringing the need for the consistent regulation of pornographic content to your Lordships’ attention. Last week, I spoke about my concerns about pornography; I will not repeat them here. I said then that the Bill does not go far enough on pornography, partly because of the inconsistent regulation regimes between Part 3 services and Part 5 ones.
In February, the All-Party Parliamentary Group on Commercial Sexual Exploitation made a series of recommendations on the regulation of pornography. Its first recommendation was this:
“Make the regulation of pornography consistent across different online platforms, and between the online and offline spheres”.
It went on to say:
“The reforms currently contained in the Online Safety Bill not only fail to remedy this, they introduce further inconsistencies in how different online platforms hosting pornography are regulated”.
This is our opportunity to get it right but we are falling short. The amendments in the name of the noble Baroness, Lady Ritchie, go to the heart of the issue by ensuring that the duties that currently apply to Part 5 services will also apply to Part 3 services.
Debates about how these duties should be amended or implemented will be dealt with later on in our deliberations; I look forward to coming back to them in detail then. Today, the question is whether we are willing to have inconsistent regulation of pornographic content across the services that come into the scope of the Bill. I am quite sure that, if we asked the public in an opinion poll whether this was the outcome they expected from the Bill, they would say no.
An academic paper published in 2021 reported on the online viewing habits of 16 and 17-year-olds. It found that pornography was much more frequently viewed on social media, underlining the continuing importance of regulating such sites. The impact of pornography is no different whether it is seen on a social media or pornography site with user-to-user facilities that falls within Part 3 or on a site that has only provider content that would fall within Part 5. There should not be an either/or approach to different services providing the same content, which is why I think that Amendment 125A is critical. If all pornographic content is covered by Part 5, what does and does not constitute user-generated material ceases to be our concern. Amendment 125A highlights this issue; I too look forward to hearing the Minister’s response.
There is no logic to having different regulatory approaches in the same Bill. They need to be the same and come into effect at the same time. That is the simple premise of these amendments; I fully support them.
My Lords, earlier today the noble Baroness, Lady Benjamin, referred to a group of us as kindred spirits. I suggest that all of us contributing to this debate are kindred spirits in our desire to see consistent outcomes. All of us would like to see a world where our children never see pornography on any digital platform, regardless of what type of service it is. At the risk of incurring the ire of my noble friend Lord Moylan, we should have zero tolerance for children seeing and accessing pornography.
I agree with the desire to be consistent, as the noble Baroness, Lady Ritchie, and the noble Lord, Lord Browne, said, but it is consistency in outcomes that we should focus on. I am very taken with the point made by the noble Lord, Lord Allan, that we must be very careful about the unintended consequences of a consistent regulatory approach that might end up with inconsistent outcomes.
When we get to it later—I am not sure when—I want to see a regulatory regime that is more like the one reflected in the amendments tabled by the noble Baroness, Lady Kidron, and my noble friend Lord Bethell. We need in the Bill a very clear definition of what age assurance and age verification are. We must be specific on the timing of introducing the regulatory constraints on pornography. We have all waited far too long for that to happen and that must be in the Bill.
I am nervous of these amendments that we are debating now because I fear other unintended consequences. Not only does this not incentivise general providers, as the noble Lord, Lord Allan, described them, to remove porn from their sites but I fear that it incentivises them to remove children from their sites. That is the real issue with Twitter. Twitter has very few child users; I do not want to live in a world where our children are removed from general internet services because we have not put hard age gates on the pornographic content within them but instead encouraged those services to put an age gate on the front door. Just as the noble Lord, Lord Allan, said earlier today, I fear that, with all the best intentions, the desire to have consistent outcomes and these current amendments would regulate the high street rather than the porn itself.
My Lords, there is absolutely no doubt that across the Committee we all have the same intent; how we get there is the issue between us. It is probably about the construction of the Bill, rather than the duties that we are imposing.
It is a pleasure again to follow the noble Baroness, Lady Harding. If you take what my noble friend Lord Allan said about a graduated response and consistent outcomes, you then get effective regulation.
I thought that the noble Baroness, Lady Kidron, had it right. If we passed her amendments in the second group, and included the words “likely to be accessed”, Clause 11 would bite and we would find that there was consistency of outcomes for primary priority content and so on, and we would then find ourselves in much the same space. However, it depends on the primary purpose. The fear that we have is this. I would not want to see a Part 5 service that adds user-generated content then falling outside Part 5 and finding itself under Part 3, with a different set of duties.
I do not see a huge difference between Part 3 and Part 5, and it will be very interesting when we come to debate the later amendments tabled by the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron. Again, why do we not group these things together to have a sensible debate? We seem to be chunking-up things in a different way and so will have to come back to this and repeat some of what we have said. However, I look forward to the debate on those amendments, which may be a much more effective way of dealing with this than trying to marry Part 5 and Part 3.
I understand entirely the motives of the noble Baroness, Lady Ritchie, and that we want to ensure that we capture this. However, it must be the appropriate way of regulating and the appropriate way of capturing it. I like the language about consistent outcomes without unintended consequences.
(1 year, 7 months ago)
Lords Chamber
My Lords, I refer the Committee to my interests as put in the register and declared in full at Second Reading. I will speak to Amendment 2 in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Harding, to Amendments 3 and 5 in my name, and briefly to Amendments 19, 22, 298 and 299 in the name of the noble Baroness, Lady Harding.
The digital world does not have boundaries in the way that the Bill does. It is an ecosystem of services and products that are interdependent. A user journey is made up of incremental signals, nudges and enticements that mean that, when we use our devices, very often we do not end up where we intended to start. The current scope covers user-to-user, search and commercial porn services, but a blog or website that valorises self-harm and depression or suggests starving yourself to death is still exempt because it has limited functionality. So too are games without a user-to-user function, in spite of the known harm associated with game addiction highlighted recently by Professor Henrietta Bowden-Jones, national expert adviser on gambling harms, and the World Health Organization in 2019 when it designated gaming disorder as a behavioural addiction.
There is also an open question about immersive technologies, whose protocols are still very much in flux. I am concerned that the Government are willing to assert that these environments will meet the bar of user-to-user when those that are still building immersive environments make quite clear that that is not a given. Indeed, later in Committee I will be able to demonstrate that already the very worst harms are happening in environments that are not clearly covered by the Bill.
Another unintended consequence of the current drafting is that the task of working out whether you are on a regulated or unregulated service is left entirely to children. That is not what we had been promised. In December the Secretary of State wrote in a public letter to parents,
“I want to reassure every person reading this letter that the onus for keeping young people safe online will sit squarely on the tech companies’ shoulders”.
It is likely that the Minister will suggest that the limited-functionality services will be caught by the gatekeepers. But, as in the case of immersive technology, it is dangerous to suggest that, just because search and user-to-user are the primary access points in 2023, that will remain the case. We must be more forward-thinking and ensure that services that promote harm and are likely to be accessed are in scope by default.
Amendments 3 and 5 are consequential, so I will not debate them now. I have listened to the Government and come back with a reasonable and implementable amendment that applies only to services that are likely to be accessed by children and that enable harm. I now ask the Government to listen and do likewise.
Amendments 92 and 193 cover the child user condition. The phrase “likely to be accessed”, introduced in this House into what became the Data Protection Act 2018, is one of the most unlikely successful British exports. Both the phrase and its definition, set out by the ICO, have been embedded in regulations in countries the world over—yet the Bill replaces this established language while significantly watering down the definition.
The Bill requires
“a significant number of children”
to use the service, or for the service to be
“likely to attract a significant number of users who are children”.
“Significant” in the Bill is defined relative to the overall UK user base, which means that extremely large platforms could deem a few thousand child users not significant compared with the several million-strong user base. Since only services that cross this threshold need comply with the child safety duties, thousands of children will not benefit from the safety duties that the Minister told us last week were at the heart of the Bill.
Amendment 92 would put the ICO’s existing and much-copied definition into the Bill. It says a service is
“likely to be accessed by children”
if
“the service is designed or intended for use by children … children form a substantive and identifiable user group … the possibility of a child accessing the service is more probable than not, taking into consideration … the nature and content of the service and whether that has particular appeal for children … the way in which the service is accessed and any measures in place to prevent children gaining access … market research, current evidence on user behaviour, the user base of similar or existing services”
that are likely to be accessed.
Having two phrases and definitions is bad for business and even worse for regulators. The ICO has first-mover advantage and a more robust test. It is my contention that parents, media and perhaps even our own colleagues would be very shocked to know that the definition in the Bill has the potential for many thousands, and possibly tens of thousands, of children to be left without the protections that the Bill brings forward. Perhaps the Minister could explain why the Government have not chosen regulatory alignment, which is good practice.
Finally, I will speak briefly in support of Amendments 19, 22, 298 and 299. I am certain that the noble Baroness, Lady Harding, will spell out how the app stores of Google and Apple are simply a subset of “search”, in that they are gatekeepers to accessing more than 5 million apps worldwide and the first page of each is indeed a search function. Their inclusion should be obvious, but I will add a specific issue about which I have spoken directly with both companies and about which the 5Rights Foundation, of which I am chair, has written to the ICO.
When we looked at the age ratings of apps across Google Play Store and Apple, four things emerged. First, apps are routinely rated much lower than their terms and conditions: for example, Amazon Shopping says 18 but has an age rating of 4 on Apple. This pattern goes across both platforms, covering social sites, gaming, shopping, et cetera.
Secondly, the same apps and services did not have the same age rating across both services, which, between them, are gatekeepers for more than 95% of the app market. In one extreme case, an app rated 4 on one of them was rated 16 on the other, with other significant anomalies being extremely frequent.
Thirdly, almost none of the apps considered their data protection duties in coming to a decision on their age rating, which is a problem, since privacy and safety are inextricably linked.
Finally, in the case of Apple, using a device registered to a 15 year-old, we were able to download age-restricted apps including a dozen or more 18-plus dating sites. In fairness, I give a shoutout to Google, which, because of the age-appropriate design code, chose more than a year ago not to show 18-plus content to children in its Play Store. So this is indeed a political and business choice and not a question of technology. Millions of services are accessed via the App Store. Given the Government’s position—that gatekeepers have specific responsibilities in relation to harmful content and activity—surely the amendments in the name of the noble Baroness, Lady Harding, are necessary.
My preference was for a less complicated Bill based on principles and judged on outcomes. I understand that that ship has sailed, but it is not acceptable for the Government now to use the length and complexity of the Bill as a reason not to accept amendments that would fill loopholes where harm has been proven. It is time to deliver on the promises made to parents and children, and to put the onus for keeping young people safe online squarely on tech companies’ shoulders. I beg to move.
My Lords, I rise to speak to Amendments 19, 22, 298 and 299 in my name and those of the noble Baroness, Lady Stowell, and the noble Lords, Lord Knight and Lord Clement-Jones. I will also briefly add at the end of my speech my support for the amendments in the name of my friend, the noble Baroness, Lady Kidron. It has been a huge privilege to be her support act all the way from the beginnings of the age-appropriate design code; it feels comfortable to speak after her.
I want briefly to set out what my amendments would do. Their purpose is to bring app stores into the child protection elements of the Bill. Amendment 19 would require app stores to prepare
“risk assessments equal to user-to-user services due to their role in distributing online content through apps to children and as a primary facilitator of user-to-user”
services reaching children. Amendment 22 would mandate app stores
“to use proportionate and proactive measures, such as age assurance, to prevent children”
coming into contact with
“primary priority content that is harmful to children”.
Amendments 298 and 299 would simply define “app” and “app stores”.
Let us be clear what app stores do. They enable customers to buy apps and user-to-user services. They enable customers to download free apps. They offer up curated content in the app store itself and decide what apps someone would like to see. They enable customers to search for apps for user-to-user content. They provide age ratings; as the noble Baroness, Lady Kidron, said, there may be different age ratings in different app stores for the same app. They sometimes block the download of apps based on the age rating and their assessment of someone’s age, but not always, and it is different for different app stores.
Why should they be included in this Bill—if it is not obvious from what I have already said? First, two companies are profiting from selling user-to-user products to children. Two app stores account for some 98%-plus of all downloads of user-to-user services, with no requirements to assess the risk of selling those products to children or to mitigate those risks. We do not allow that in the physical world so we should not allow it in the digital world.
Secondly, parents and teenagers tell us that this measure would help. A number of different studies have been done; I will reference just two. One was by FOSI, the Family Online Safety Institute, which conducted an international research project in which parents consistently said that having age assurance at the app store level would make things simpler and more effective for them; ironically, the FOSI research was conducted with Google.
The noble Lord makes a good point. I certainly think we are heading into a world where there will be more regulation of app stores. Google and Apple are commercial competitors with some of the people who are present in their stores. A lot of the people in their stores are in dispute with them over things such as the fees that they have to pay. It is precisely for that reason that I do not think we should be throwing online safety into the mix.
There is a role for regulating app stores, which primarily focuses on these commercial considerations and their position in the market. There may be something to be done around age-rating; the noble Baroness made a very good point about how age-rating works in app stores. However, if we look at the range of responsibilities that we are describing in this Bill and the tools that we are giving to intermediaries, we see that they are the wrong, or inappropriate, set of tools.
Would the noble Lord acknowledge that app stores are already undertaking these age-rating and blocking decisions? Google has unilaterally decided that, if it assesses that you are under 18, it will not serve up over-18 apps. My concern is that this is already happening but it is happening indiscriminately. How would the noble Lord address that?
The noble Baroness makes a very good point; they are making efforts. There is a role for app stores to play but I hope she would accept that it is qualitatively different from that played by a search engine or a user-to-user service. If we were to decide, in both instances, that we want app stores to have a greater role in online safety and a framework that allows us to look at blogs and other forms of content, we should go ahead and do that. All I am arguing is that we have a Bill that is carefully constructed around two particular concepts, a user-to-user service and a search engine, and I am not sure it will stretch that far.
My Lords, as my name is on Amendment 9, I speak to support these amendments and say that they are worthy of debate. As your Lordships know, I am extremely supportive of the Bill and hope that it will be passed in short order. This legislation is much needed and overdue, providing us with a regulator that is able to hold platforms to account, protect users where it can and enhance child safety online. I can think of no better regulator for that role than Ofcom.
I have listened to the debate with great interest. Although I support the intentions of my noble friend Lord Moylan’s amendment, I am not sure I agree with him that there are two cultures in this House, as far as the Bill is concerned; I think everybody is concerned about child safety. However, these amendments are right to draw attention to the huge regulatory burden that this legislation can potentially bring, and to the inadvertent bad consequences it will bring for many of the sites that we all depend upon and use.
I have not signed many amendments that have been tabled in this Committee because I have grown increasingly concerned, as has been said by many others, that the Bill has become a bit like the proverbial Christmas tree where everyone hangs their own specific concern on to the legislation, turning it into something increasingly unwieldy and difficult to navigate. I thought the noble Baroness, Lady Fox, put it extremely well when she effectively brought to life what it would be like to run a small website and have to comply with this legislation. That is not to say that certain elements of micro-tweaking are not welcome—for example, the amendment by the noble Baroness, Lady Kidron, on giving coroners access to data—but we should be concerned about the scope of the Bill and the burden that it may well put on individual websites.
This is in effect the Wikipedia amendment, put forward and written in a sort of wiki way by this House—a probing amendment in Committee to explore how we can find the right balance between giving Ofcom the powers it needs to hold platforms to account and not unduly burdening websites that all of us agree present a very low risk and whose provenance, if you like, does not fit easily within the scope of the Bill.
I keep saying that I disagree with my noble friend Lord Moylan. I do not—I think he is one of the finest Members of this House—but, while it is our job to provide legislation to set the framework for how Ofcom regulates, we in this House should also recognise that in the real world, as I have also said before, this legislation is simply going to be the end of the beginning. Ofcom will have to find its way forward in how it exercises the powers that Parliament gives it, and I suspect it will have its own list of priorities in how it approaches these issues, who it decides to hold to account and who it decides to enforce against. A lot of its powers will rest not simply on the legislation that we give it but on the relationship that it builds with the platforms it is seeking to regulate.
For example, I have hosted a number of lunches for Google in this House with interested Peers, and it has been interesting to get that company’s insight into its working relationship with Ofcom. By the way, I am by no means suggesting that that is a cosy relationship, but it is at least a relationship where the two sides are talking to each other, and that is how the effectiveness of these powers will be explored.
I urge noble Lords to take these amendments seriously and to embrace their spirit, which is to be mindful of the regulatory burden that the Bill imposes; to be aware that the Bill will not, simply by being passed, solve the kinds of issues that we are seeking to tackle in terms of the most egregious content that we find on the internet; and to recognise that, effectively, Ofcom’s task once this legislation is passed will be the language of priorities.
My Lords, this is not the first time in this Committee, and I suspect it will not be the last, when I rise to stand somewhere between my noble friend Lord Vaizey and the noble Baroness, Lady Kidron. I am very taken by her focus on risk assessments and by the passionate defences of Wikipedia that we have heard, which really are grounded in a sort of commoner’s risk assessment that we can all understand.
Although I have sympathy with the concerns of the noble Baroness, Lady Fox, about small and medium-sized businesses being overburdened by regulation, I am less taken with the amendments on that subject precisely because small tech businesses become big tech businesses extremely quickly. It is worth pointing out that TikTok did not even exist when Parliament began debating this Bill. I wonder what our social media landscape would have been like if the Bill had existed in law before social media started. We as a country should want global tech companies to be born in the UK, but we want their founders—who, sadly, even today, are predominantly young white men who do not yet have children—to think carefully about the risks inherent in the services they are creating, and we know we need to do that at the beginning of those tech companies’ journeys, not once they have reached 1 million users a month.
While I have sympathy with the desire of the noble Baroness, Lady Fox, not to overburden, just as my noble friend Lord Vaizey has said, we should take our lead from the intervention of the noble Baroness, Lady Kidron: we need a risk assessment even for small and medium-sized businesses. It just needs to be a risk assessment that is fit for their size.
Everything the noble Baroness has said is absolutely right, and I completely agree with her. The point I simply want to make is that no form of risk-based assessment will achieve a zero-tolerance outcome, but—
I am so sorry, but may I offer just one final thought from the health sector? While the noble Lord is right that where there are human beings there will be error, there is a concept in health of the “never event”—that when that error occurs, we should not tolerate it, and we should expect the people involved in creating that error to do a deep inspection and review to understand how it occurred, because it is considered intolerable. I think the same exists in the digital world in a risk assessment framework, and it would be a mistake to ignore it.
My Lords, I am now going to attempt for the third time to beg the House’s leave to withdraw my amendment. I hope for the sake of us all, our dinner and the dinner break business, for which I see people assembling, that I will be granted that leave.