All 6 Debates between Baroness Kidron and Lord Knight of Weymouth

Wed 12th Jul 2023
Thu 22nd Jun 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 2nd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 2nd May 2023: Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Wed 19th Apr 2023: Online Safety Bill, Lords Chamber (Committee stage)
Mon 6th Nov 2017: Data Protection Bill [HL], Lords Chamber (Committee: 2nd sitting, House of Lords)

Online Safety Bill

Debate between Baroness Kidron and Lord Knight of Weymouth
Baroness Kidron (CB)

My Lords, I am interested to hear what the Minister says, but could he also explain to the House the difference in status of this sort of material in Part 5 versus Part 3? I believe that the Government brought in a lot of amendments that sorted it out and that many of us hoped were for the entire Bill, although we discovered, somewhat to our surprise, that they were only in Part 5. I would be interested if the Minister could expand on that.

Lord Knight of Weymouth (Lab)

My Lords, I am grateful to the noble Lord, Lord Clement-Jones, for raising this; it is important. Clause 49(3)(a)(i) mentions content

“generated directly on the service by a user”,

which, to me, implies that it would include the actions of another user in the metaverse. Sub-paragraph (ii) mentions content

“uploaded to or shared on the service by a user”,

which covers bots or other quasi-autonomous virtual characters in the metaverse. As we heard, a question remains about whether any characters or objects provided by the service itself are covered.

A scenario—in my imagination anyway—would be walking into an empty virtual bar at the start of a metaverse service. This would be unlikely to be engaging: the attractions of indulging in a lonely, morose drink at that virtual bar are limited. The provider may therefore reasonably configure the algorithm to generate characters and objects that are engaging until enough users then populate the service to make it interesting.

Of course, there is the much more straightforward question of gaming platforms. On Monday, I mentioned “Grand Theft Auto”, a game with an advisory age of 17—they are still children at that age—which is routinely accessed by younger children. Shockingly, an article that I read claimed that it can evolve into a pornographic experience, where the player becomes the character from a first-person angle and receives services from virtual sex workers, as part of the game design. So my question to the Minister is: does the Bill protect users from these virtual characters interacting with them in virtual worlds?

Online Safety Bill

Debate between Baroness Kidron and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

The fact that I labelled it as being AI-generated helped your Lordships to understand, and the transparency eases the debate. I beg to move.

Baroness Kidron (CB)

My Lords, I thank the noble Lord, Lord Knight, for laying out the amendment and recognise that there was a very thoughtful debate on the subject of machine-generated content on Amendment 125 in my name on a previous day of Committee.

I appreciate that the concept of labelling or watermarking machine-generated material is central to recent EU legislation, but I am equally aware that there is more than one school of thought on the efficacy of that approach among AI experts. On the one hand, as the noble Lord, Lord Knight, beautifully set out—with the help of his artificial friend—there are those who believe that visibly marking the division of real and altered material is a clue for the public to look more carefully at what they are seeing, and that labelling it might provide an opportunity for both creators and digital companies to give greater weight to “human-created material”. For example, it could be that the new BBC Verify brand is given greater validity by the public, or that Google’s search results promote it above material labelled as machine-generated as a more authentic source.

There are others who feel that the scale of machine-generated material will be so vast that this labelling will be impossible, or that labelling will downgrade the value of very important machine-generated material in the public imagination, when in the very near future it is likely that most human activity will be a blend of generated material and human interaction.

I spent the first part of this week locked in a room with others at the Institute for Ethics in AI in Oxford debating some of these issues. While this is a very live discussion, one thing is clear: if we are to learn from history, we must act now before all is certain, and we should act with pragmatism and a level of humility. It may be that either or both sets of experts are correct.

Industry has clearly indicated that there is an AI arms race, and many companies are launching services that they do not understand the implications of. This is not my view but one told to me by a company leader, who said that the speed of distribution was so great that the testing was confined to whether deploying large language models crashed the platforms; there was no testing for safety.

The noble Lord, Lord Stevenson, says in his explanatory statement that this is a probing amendment. I therefore ask the Minister whether we might meet before Report and look once again at the gaps that might be covered by some combination of Amendment 125 and the amendment in front of us, to make certain that the Bill adequately reflects the concerns raised by the enforcement community and reflects the advice of those who best understand the latest iterations of the digital world.

The Communications Act 2003 made a horrible mistake in not incorporating digital within it; let us not do the same here. Adding explicit safety duties to AI and machine learning would not slow down innovation but would ensure that innovation is not short-sighted and dangerous for humanity. It is a small amendment for what may turn out to be an unimaginably important purpose.

Online Safety Bill

Debate between Baroness Kidron and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

I would not want to disagree with the noble Baroness for a moment.

Baroness Kidron (CB)

Does the noble Lord think it is also important to have some idea of measurement? Age assurance in certain circumstances is far more accurate than age verification.

Lord Knight of Weymouth (Lab)

Yes; the noble Baroness is right. She has pointed out in other discussions I have been party to that, for example, gaming technology that looks at the movement of the player can quite accurately work out from their musculoskeletal behaviour, I assume, the age of the gamer. So there are alternative methods. Our challenge is to ensure that if they are to be used, we will get the equivalent of age verification or better. I now hand over to the Minister.

Online Safety Bill

Debate between Baroness Kidron and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

I am grateful to the noble Lord. In many ways, I am reminded of the article I read in the New York Times this weekend and the interview with Geoffrey Hinton, the now former chief scientist at Google. He said that as companies improve their AI systems, they become increasingly dangerous. He said of AI technology:

“Look at how it was five years ago and how it is now. Take the difference and propagate it forwards. That’s scary”.


Yes, the huge success of the iPhone and of mobile phones, and all of us, as parents, handing our now-redundant iPhones on to our children, has meant that children have huge access. We have heard the stats in Committee around the numbers who are still in primary school and on social media, despite the terms and conditions of those platforms. That is precisely why we are here, trying to get things designed to be safe as far as is possible from the off, but recognising that it is dynamic and that we therefore need a regulator to keep an eye on the dynamic nature of these algorithms as they evolve, ensuring that they are safe by design as they are being engineered.

My noble friend Lord Stevenson has tabled Amendment 27, which looks at targeted advertising, especially that which requires data collection and profiling of children. In that, he has been grateful to Global Action Plan for its advice. While advertising is broadly out of scope of the Bill, apart from in respect of fraud, it is significant for the Minister to reflect on the user experience for children. Whether it is paid or organic content, it is pertinent in terms of their safety as children and something we should all be mindful of. I say to the noble Lord, Lord Vaizey, that as I understand it, the age-appropriate design code does a fair amount in respect of the data privacy of children, but this is much more about preventing children encountering the advertising in the first place, aside from the data protections that apply in the age-appropriate design code. But the authority is about to correct me.

Baroness Kidron (CB)

Just to add to what the noble Lord has said, it is worth noting that we had a debate, on Amendment 92, about aligning the age-appropriate design code’s “likely to be accessed” test with the Bill, and the very important issue that the noble Lord, Lord Vaizey, raised about alignment of these two regimes. I think we can say that these are kissing cousins, in that they take a by-design approach. The noble Lord is completely right that the scope of the Bill is much broader than data protection only, but they take the same approach.

Lord Knight of Weymouth (Lab)

I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.

Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. I understand, and he can confirm whether my understanding is correct, that the duties on platforms to be safe apply regardless of whether a VPN has been used to access the systems and the content. One would suppose that the platforms, the publishers of content that are user-to-user businesses, will have to detect whether a VPN is being used, in order to ensure that children are being protected and that a user is genuinely a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to detect whether someone is landing on their site via a VPN or otherwise? To my mind, the anecdote that the noble Baroness, Lady Harding, related, about Apple’s App Store algorithm pushing VPN apps to those searching for pornography, reinforces the need for app stores to be brought into scope, so that we can get some of that age filtering at the distribution point, rather than relying on the platforms alone.

Substantially, this group is about platforms anticipating harms, not reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make the business models work, rather than the other way around, we will have a much better place for children.

Online Safety Bill

Debate between Baroness Kidron and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

I strongly support my noble friend in his amendment. I clarify that, in doing so, I am occupying a guest slot on the Front Bench: I do so as a member of his team but also as a member of the former Joint Committee. As my noble friend set out, this reflects where we got to in our thinking as a Joint Committee all that time ago. My noble friend said “at last”, and I echo that and what others said. I am grateful for the many briefings and conversations that we have had in the run-up to Committee, but it is good to finally be able to get on with it and start to clear some of these things out of my head, if nothing else.

In the end, as everyone has said, this is a highly complex Bill. Like the noble Baroness, Lady Stowell, in preparation for this I had another go at trying to read the blooming thing, and it is pretty much unreadable—it is very challenging. That is right at the heart of why I think this amendment is so important. Like the noble Baroness, Lady Kidron, I worry that this will be a bonanza for the legal profession, because it is almost impenetrable when you work your way through the wiring of the Bill. I am sure that, in trying to amend it, some of us will have made errors. We have been helped by the Public Bill Office, but we will have missed things and got things the wrong way around.

It is important to have something purposive, as the Joint Committee wanted, and to have clarity of intent for Ofcom, including that this is so much more about systems than about content. Unlike the noble Baroness, Lady Stowell—clearly, we all respect her work chairing the communications committee and the insights she brings to the House—I think that a very simple statement, restricting it just to proposed new paragraph (g), is not enough. It would almost be the same as the description at the beginning of the Bill, before Clause 1. We need to go beyond that to get the most from having a clear statement of how we want Ofcom to do its job and the Secretary of State to support Ofcom.

I like what the noble Lord, Lord Allan, said about the risk of overcommitment and underdevelopment. When the right reverend Prelate the Bishop of Oxford talked about being the safest place in the world to go online, which is the claim that has been made about the Bill from the beginning, I was reminded again of the difficulty of overcommitting and underdelivering. The Bill is not perfect, and I do not believe that it will be when this Committee and this House have finished their work; we will need to keep coming back and legislating and regulating in this area, as we pursue the goal of being the safest place in the world to go online—but it will not be any time soon.

I say to the noble Baroness, Lady Fox, who I respect, that I understand what she is saying about some of her concerns about a risk-free child safety regime and the unintended consequences that may come from this legislation. But at its heart, what motivates us, and makes us believe that getting the Bill right is one of the most important things we will do in all our time in this Parliament, is the unintended consequences of the algorithms that these tech companies have created in pushing content at children that they do not want to hear. I see the noble Baroness, Lady Kidron, wanting to comment.

Baroness Kidron (CB)

I just want to say to the noble Baroness, Lady Fox, that we are not looking to mollycoddle children or put them in cotton wool; we are asking for a system where they are not systematically exploited by major companies.

Lord Knight of Weymouth (Lab)

I very much agree. The core of what I want to say in supporting this amendment is that in Committee we will do what we are here to do. There are a lot of amendments to what is a very long and complicated Bill: we will test the Minister and his team on what the Government are trying to achieve and whether they have things exactly right in order to give Ofcom the best possible chance to make it work. But when push comes to shove at the end of the process, at its heart we need to build trust in Ofcom and give it the flexibility to be able to respond to the changing online world and the changing threats to children and adults in that online world. To do that, we need to ensure that we have the right amount of transparency.

I was particularly pleased to see proposed new paragraph (g) in the amendment, on transparency, as referenced by the noble Baroness, Lady Stowell. It is important that we have independence for Ofcom; we will come to that later in Committee. It is important that Parliament has a better role in terms of accountability so that we can hold Ofcom to account, having given it trust and flexibility. I see this amendment as fundamental to that, because it sets the framework for the flexibility that we then might want to be able to give Ofcom over time. I argue that this is about transparency of purpose, and it is a fundamental addition to the Bill to make it the success that we want.

Data Protection Bill [HL]

Debate between Baroness Kidron and Lord Knight of Weymouth
Lord Knight of Weymouth

My Lords, does the Minister agree with the noble Lord, Lord Storey, that PSHE would be the most appropriate way to educate young people about data rights? If so, I note that the Secretary of State, Justine Greening, has today announced that Ian Bauckham will lead the review on how relationship and sex education for the 21st century will be delivered. Can the Minister, who is clearly prepared to think about this appointment today, ask whether it is within his scope to think about how data rights education may be delivered as part of that review, and whether the review will draw on the work of the previous person who reviewed the delivery of PSHE, Sir Alasdair Macdonald, the last time Parliament thought that compulsory SRE was a good idea?

Baroness Kidron

I support the amendment. I was on the House of Lords Communications Committee, to which the noble Lord just referred. We recommended that digital literacy be given the same status as reading, writing and arithmetic. We set out an argument for a single cross-curricular framework of digital competencies—evidence-based, taught by trained teachers—in all schools whatever their legal status.

At Second Reading, several noble Lords referred to data as the new oil. I have been thinking about it since: I am not so certain. Oil may one day run out; data is infinite. What I think we can agree is that understanding how data is gathered, used and stored, and, most particularly, how it can be harnessed to manipulate both your behaviour and your digital identity, is a core competency for a 21st-century child. While I agree with the noble Lord that the best outcome would be a single, overarching literacy strategy, this amendment would go some small way towards that.