Online Safety Bill Debate
Lords Chamber

My Lords, I rise very briefly to support the amendments in the name of the noble Baroness, Lady Stowell, and the noble Lord, Lord Stevenson. Like other speakers, I put on record my support for the regulator being offered independence and Parliament having a role.
However, I want to say one very brief and minor thing about timing—I feel somewhat embarrassed after the big vision of the noble Baroness, Lady Stowell. Having had quite a lot of experience of code making over the last three years, I found that the amount of time the department was able to take in responding to the regulator became a point of power, a point of lobbying, as others have said, and a point of huge distraction. Those of us who have followed the Bill for five years, and through as many Secretaries of State, should be concerned that none of the amendments has quite tackled the question of time.
The idea of acting within a timeframe is not without precedent; the National Security and Investment Act 2021 is just one recent example. What was interesting about that Act was that the reason given for the Secretary of State’s powers being necessary was national security—that is, they were acceptable and what we all agree should happen—whereas the reason for the time restriction was business stability. I put it to the Committee that the real prospect of children and other users being harmed deserves the same consideration as business stability. Without a time limit, it is possible for inaction to be used to control the process, or simply to fritter it away.
My Lords, I will make a short contribution on this substantive question of whether concerns about ministerial overreach are legitimate. Based on a decade of being on the receiving end of representations from Ministers, the short answer is yes. I want to expand on that with some examples.
My experience of working on the other side, inside a company, was that you often got what I call the cycle of outrage: something is shared on social media that upsets people; the media write a front-page story about it; government Ministers and other politicians get involved; that then feeds back into the media, and the cycle spins up to a point where something must be done. The “something” is typically that the Minister summons people, such as me in my old job, into an office. That itself often becomes a major TV moment: you are brought in, browbeaten and sent out again with your tail between your legs, having been instructed by the Minister to do something. That entire process takes place in the political rather than the regulatory domain.
I readily concede that, in many cases, something of substance needed to be addressed and there was a genuine problem. It is not that this was illegitimate, but these amendments are about the process we should follow when that outrage is happening. I agree entirely with the tablers of the amendments that, to the extent that that process can be encapsulated within the regulator rather than a Minister acting on an ad hoc basis, it would be a significant improvement.
I also note that this is certainly not UK-specific; it would happen in many countries, with varying degrees of threat. I remember being summoned to the Ministry of the Interior in Italy to meet a gentleman who has now sadly passed away. He brought me into his office, sat me down, pointed to his desk and said, “You see that desk? That was Mussolini’s desk”. He was a nice guy and I left with a CD of his rhythm and blues band, but it was clear that I was not supposed to say no to him. He gave a very clear and explicit political direction about content that was on the platform.
One big advantage of this Bill is that it has the potential to move beyond that world. It could move us away from deals done with individual people in companies—the noble Baroness, Lady Stowell of Beeston, made this point very powerfully—and change the accountability model, so that we rely neither on platforms being entirely accountable to themselves nor on platforms and others, including Ministers, somehow doing deals that will have an impact, as the noble Baroness, Lady Fox, and the noble Viscount, Lord Colville, said, on the freedom of expression of people across the country. We do not want that.
We want to move on in the Bill and I think we have a model which could work. The regulator will take on the outrage and go as far as it can under the powers granted in the Bill. If the regulator believes that it has insufficient powers, it will come back to Parliament and ask for more. That is the way in which the system can and should work. I think I referred to this at Second Reading; we have an opportunity to create clear accountability. Parliament instructs Ofcom, which instructs the platforms. The platforms do what Ofcom says, or Ofcom can sanction them. If Ofcom feels that its powers are deficient, it comes back to Parliament. The noble Lord, Lord Stevenson, and others made the point about scrutiny and us continually testing whether Ofcom has the powers and is exercising them correctly. Again, that is entirely beneficial and the Government should certainly be minded to accept those amendments.
With the Secretary of State powers, as drafted in the Bill and without the amendments we are considering today, we are effectively taking two steps forward and one step back on transparency and accountability. We have to ask: why take that step back when we are able to rely on Ofcom to do the job without these directions?
The noble Baroness, Lady Stowell of Beeston, made the point very clearly that there are other ways of doing this. The Secretary of State can express their view. I am sure that the Minister will be arguing that the Secretary of State’s powers in the Bill are better than the status quo because at least what the Secretary of State says will be visible; it will not be a back-room deal. The noble Baroness, Lady Stowell of Beeston, has proposed a very good alternative, where the Secretary of State makes visible their intentions, but not in the form of an order—rather in the form of advice. The public—it is their speech we are talking about—then have the ability to see whether they agree with Ofcom, the companies or the Secretary of State if there is any dispute about what should happen.
It is certainly the case that visible instructions from the Secretary of State would be better, but the powers as drafted still leave room for arm-twisting. I can imagine a scenario in which future employees of these platforms are summoned to the Secretary of State—but now the Secretary of State has a draft order sitting there. The draft order is Mussolini’s desk. They say to the people from the platforms, “Look, you can do what I say, or I am going to send an order to Ofcom”. That takes us back to a world in which the public do not see the kind of instructions being given.
I hope that the Government will accept that some amendment is needed here. All the ones that have been proposed suggest different ways of achieving the same objective. We are trying to protect future Secretaries of State from an unhealthy temptation to intervene in ways that they should not.
My Lords, it is a privilege to introduce Amendments 125A, 142, 161 and 184 in my name and those of the noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford. These amendments represent the very best of your Lordships’ House and, indeed, the very best of Parliament and the third sector, because they represent an extraordinary effort to reach consensus between colleagues across the House, including both opposition parties, many on the Government’s own Benches, a group of more than 40 Back-Bench Conservatives and the Opposition Front Bench in the other place. Importantly, they also enjoy the support of the commercial age check sector and a vast array of children’s charities; in that regard, I must mention the work of Barnardo’s, CEASE and 5Rights, which have really led the charge.
I will spend the bulk of my time setting out in detail the amendments themselves, and I will leave my co-signatories and others to make the arguments for them. Before I do, I once again acknowledge the work of the noble Baroness, Lady Benjamin, who has been fighting this fight for many years, and the noble Baroness, Lady Harding, whose characteristic pragmatism was midwife to the drafting process. I also acknowledge the time spent talking about this issue with the Secretary of State, the noble Lord the Minister and officials at DSIT. I thank them for their time and their level of engagement.
Let me first say a few words about age assurance and age verification. Age assurance is the collective term for all forms and levels of age verification, which establishes an exact age, and age estimation, which gives an approximate or probable age. Age assurance is not a technology; it is any system that seeks to achieve a level of certainty about the age or age range of a person. Some services offering restricted products or content have no choice but to operate at the very highest level of assurance or certainty—others less so.
To be clear at the outset, checking someone’s age, whether by verification or estimation, is not the same as establishing identity. While it is absolutely the case that you can establish age as a subset of establishing someone’s identity, the reverse is not necessarily true. Checking someone’s age does not need to establish their identity.
Age assurance strategies are multifaceted. As the ICO’s guidance in the age-appropriate design code explains, online services can deploy a range of methods to achieve the necessary level of certainty about age or age range. For example, self-verification, parental authentication, AI estimation and/or the use of passports and other hard identifiers may all play a role in a single age assurance strategy, or any one of them may be a mechanism in itself in other circumstances. This means that the service must consider its product and make sure that the level of age assurance meets the level of risk.
Since we first started debating these issues in the context of the Digital Economy Act 2017, the technology has been transformed. Today, age assurance might just as effectively be achieved by assessing the fluidity of movement of a child dancing in a virtual reality game as by collecting their passport. The former is over 94% accurate within five seconds and is specific to that particular child, while a passport may be absolute but less reliable in associating the check with a particular child. So, in the specific context of that dancing child, it is likely that the former gives the greater assurance. When a service’s risk profile requires absolute or near absolute certainty—for example, any of the risks that are considered primary priority harms, including, but not limited to, pornography—having the highest possible level of assurance must be a precondition of access.
Age assurance can also be used to ensure that children who are old enough to use a service have an age-appropriate experience. This might mean disabling high-risk features such as hosting, livestreaming or private messaging for younger children, or targeting child users or certain age groups with additional safety, privacy and well-being interventions and information. These amendments, which I will get to shortly, are designed to ensure both. To achieve the levels of certainty and privacy which are widely and rightly demanded, the Bill must both reflect the current state of play and anticipate nascent and emerging technology that will soon be considered standard.
That was a long explanation, for which I apologise, but I hope it makes clear that there is no single approach but, rather, a need to dictate clearly a high bar of certainty for high-risk services. A mixed economy of approaches, all geared towards providing good outcomes for children, is what we should be promoting. Today we have the technology, the political will and the legislative mechanism to make good on our adult responsibilities to protect children online. While age assurance is eminently achievable, those responsible for implementing it and, even more importantly, those subject to it need clarity on standards—that is to say, rules of the road. In an era when data is a global currency, services have shown themselves unable to resist the temptation to repurpose information gleaned about the age of their users, or to facilitate children’s access to industrial quantities of harmful material for commercial gain. As with so many of tech’s practices, this has eroded trust and heightened the need for absolute clarity on how services build their age-assurance systems, on what they do—and do not do—with the information they gather, and on the efficacy and security of the judgments they make.
Amendment 125A simply underlines the point made frequently in Committee by the noble Baroness, Lady Ritchie of Downpatrick, that the Bill should make it clear that pornography should not be judged by where it is found but by the nature of the material itself. It would allow Ofcom to provide guidance on pornographic material that should be behind an age gate, either in Part 3 or Part 5.
Amendment 142 seeks to insert a new clause setting out matters that Ofcom must reflect in its guidance for effective age assurance; these are the rules of the road. Age assurance must be secure and maintain the highest levels of privacy; this is paramount. I do not believe I need to give examples of the numerous data leaks but I note the excessive data harvesting undertaken by some of the major platforms. Age assurance must not be an excuse to collect users’ personal and sensitive information unnecessarily, and it should not be sold, stored or used for other purposes, such as advertising, or offered to third parties.
Age assurance must be proportionate to the risk, as per the results of the child risk assessment, and let me say clearly that proportionality is not a route to allow a little bit of porn or a medium amount of self-harm, or indeed a lot of both, to a small number of children. In the proposed new clause, proportionality means that if a service is high-risk, it must have the highest levels of age assurance; equally, if a service is low-risk or no-risk, it may be that no age assurance is necessary, or that it should be unobtrusive in order to be proportionate. Age-assurance systems must provide mechanisms to challenge or change decisions, to ensure that everyone can have confidence in their use and that they do not keep individuals—adults or children—out of spaces they have the right to be in. Age assurance must be inclusive and accessible, so that children with specific accessibility needs are considered at the point of design, and it must provide meaningful information so that users can understand how it operates. I note that the point about accessibility is of specific concern to the 5Rights young advisers. Systems must be effective. It sounds foolish to say so, but look at where we are now, when laws in the US, Europe, the UK and beyond stipulate age restrictions and they are ignored to the tune of tens of millions of children.
Age assurance must not rely solely on the user to provide information; a tick box confirming “I am 18” is not sufficient for any service that carries a modicum of risk. It must be compatible with the Data Protection Act, the Human Rights Act, the Equality Act and the UNCRC. It must have regard to the risks and opportunities of interoperable age assurance, which will, in future, see these systems seamlessly integrated into our services, just as opening your phone with your face, or using two-factor authentication when transferring funds, is already normalised. In producing the guidance, Ofcom must consult the Information Commissioner and other persons with relevant technological expertise and an understanding of child development.
On that point, I fully support the proposal from the noble Lord, Lord Allan, to require Ofcom to produce regular reports on age-assurance technology, and I see his amendment as a necessary companion piece to these amendments. Importantly, the amendment stipulates that the guidance should come forward within six months and that all systems of age assurance—whether estimated or verified, whether operated in-house or by third-party providers—and all technologies must adhere to the same principles. It allows Ofcom to point to technical standards in its guidance, which I know the ISO and the IEEE are currently drafting with this very set of principles in mind.
My Lords, I thank everyone for their contributions this evening. As the noble Lord, Lord Stevenson, said, it is very compelling when your Lordships’ House gets itself together on a particular subject and really agrees, so I thank noble Lords very much for that.
I am going to do two things. One is to pick up on a couple of questions and, as a number of noble Lords have said, concentrate on outcomes rather than contributions. On the issues that came up: the principle that pornography should be treated in the same way in Parts 3 and 5 is, for me, absolute. We believe the amendments achieve that; after Committee we will discuss it with noble Lords who feel it is not clear, to make sure they are comfortable that it is so. I did not quite take from the Minister’s reply that pornography was being treated in exactly the same way in Parts 3 and 5. When I say “exactly the same way”, like the noble Lord, Lord Allan, I mean not necessarily by the same technology but to the same level of outcome. That is one thing I want to emphasise, because a number of noble Lords, including the noble Baroness, Lady Ritchie, the noble Lord, Lord Farmer, and others, are rightly concerned that we should have an outcome on pornography, not concentrate on how to get there.
The second thing I want to pick up very briefly, because it was received so warmly, is the question of devices and on-device age assurance. I believe that is one method, and I know that at least one manufacturer is thinking about it as we speak. However, it is an old battle, in which companies that do not want to take responsibility for their services say that someone else should act instead. It is very important that devices, app stores or any of the supposed gatekeepers are not given an overly large responsibility. It is the responsibility of everyone to make sure that age assurance is adequate.
I hope that what the noble Baroness is alluding to is that we need to include gatekeepers, app stores, device level and sideloading in another part of the Bill.
But of course—would I dare otherwise? What I am saying is that these are not silver bullets: we must have a mixed economy, not only for what we know already but for what we do not yet know, and we must not make any one platform of age assurance overly powerful. That is incredibly important, so I wanted to pick up on that.
I also want to pick up on user behaviour and unintended consequences. I think there was a slight reference to an American law, COPPA, which is the reason that every website says 13. That is a very unhelpful entry point. It would be much better if children had an age-appropriate experience from five all the way to 18, rather than an on-off switch at 13. I understand that issue, but that is why age assurance has to be more than one thing: it is not only a preventive measure but an enabling one. I tried to make that very clear, so I will not detain the Committee on that.
On the outcome, I say to the Minister, who has indeed given a great deal of time to this, that more time is needed because we want a bar of assurance. I speak not only for all noble Lords who have made clear their rightful anxiety about pornography but also on behalf of the bereaved parents and other noble Lords who raised issues about self-harming of different varieties. We must have a measurable bar for the things that the Bill says that children will not encounter—the primary priority harms. In the negotiation, that is non-negotiable.
On the time factor, I am sorry to say that we are all witness to what happened to Part 3 of the Digital Economy Act 2017: it was pushed and pushed for years, and then it did not happen—and then it was whipped out of the Bill last week. This is not acceptable. I am happy, as I believe other noble Lords are, to negotiate a suitable time that gives Ofcom comfort, but it must be possible, with this Bill, for the regulator to bring something in within a given period of time. I am afraid that history is our enemy on this one.
The third thing is that I accept the idea that there has to be more than principles, which is what I believe Ofcom will provide. But the principles have to be 360 degrees, and the questions that I raised about security, privacy and accessibility should be in the Bill so that Ofcom can go away and make some difficult judgments. That is its job; ours is to say what the principle is.
I will tell one last tiny story. About 10 years ago, I met in secret with one of the highest-ranking safety officers in one of the companies that we always talk about. They said to me, “We call it the ‘lost generation’. We know that regulation is coming, but we know that it is not soon enough for this generation”. On behalf of all noble Lords who spoke, I ask the Government to save the next generation. With that, I withdraw the amendment.