All 10 Debates between Lord Clement-Jones and Lord Knight of Weymouth

Wed 6th Sep 2023
Wed 19th Jul 2023
Mon 17th Jul 2023
Mon 10th Jul 2023
Online Safety Bill
Lords Chamber

Report stage: Part 2
Thu 22nd Jun 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Tue 9th May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Tue 2nd May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Tue 2nd May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1

Digital Markets, Competition and Consumers Bill

Debate between Lord Clement-Jones and Lord Knight of Weymouth
Lord Clement-Jones (LD)

My Lords, I was looking forward to hearing the noble Lord, Lord Knight, introduce these amendments but, owing to a glitch in timing when tabling the amendments, I am unfortunately in the hot seat this afternoon. As well as moving Amendment 2, I will speak to Amendments 18, 23, 56 and 61.

These amendments, developed by the Institute for the Future of Work, are aimed in particular at highlighting the direct and indirect impacts on job creation, displacement and conditions and on the work environment in the UK, which are important considerations that are relevant to competition and should be kept closely under review. I look forward to hearing what the noble Lord, Lord Knight, says, as co-chair of the All-Party Parliamentary Group on the Future of Work, which helped the Institute for the Future of Work to develop the amendments.

Digital markets and competition are shaping models for work, the distribution of work, access to work and the conditions and quality of work for several different reasons. Digital connected worker and labour platforms are used across the economy, not just for online or gig work. There is concentration in digital markets, with the emergence of a few dominant actors such as Amazon and Uber, which impacts the number and nature of local jobs created or lost. There are specific anti-competitive practices, such as wage and price fixing, which is currently subject to litigation in the US, and there are secondary and spillover impacts from all the above, including the driving of new models of business that may constrain wages, terms and work quality, directly or indirectly.

A good example is cloud-based connected worker platforms, which use behavioural and predictive algorithms to nudge and predict performance, match and allocate work and set standards. There is also increased market dominance in cloud computing, on which a growing number of UK businesses depend. For example, Amazon Web Services leads the four companies that control 67% of the world’s cloud infrastructure, and itself holds over 30% of the market.

Other examples are algorithmic hiring, job matching and task-allocation systems, which are trained on data that represents past practices and, as a result, can exclude or restrict groups from labour market opportunities. Social, environmental and well-being risks and impacts, including on work conditions and environments, are under increasing scrutiny from both the consumer and the corporate sustainability perspective—seen, for instance, in the World Economic Forum’s Global Risks Report 2024, and the EU’s new corporate sustainability due diligence directive, due to be formally approved this year, which obliges firms to integrate their human rights and environmental impact into their management systems.

This suggests that consumer interests can extend to local and supply-chain impacts, and informed decision-making will need better information on work impacts. For a start, key definitions such as “digital activity” in Clause 4 need to take into account impacts on UK work and workers in determining whether there is a sufficient link to the UK. Amendment 2 is designed to do this. Secondly, the CMA’s power to impose conduct requirements in Chapter 3 of the Bill should make sure that a designated undertaking can be asked to carry out and share an assessment on work impacts. Similarly, the power in Chapter 4, Clause 46, to make pro-competition interventions, which hinges on having an adverse effect, should be amended to include certain adverse impacts on work. Amendments 18, 23 and 56 are designed to do this.

Thirdly, information and understanding about work impacts should be improved and monitored on an ongoing basis. For example, the CMA should also be able to require an organisation to undertake an assessment to ascertain impacts on work and workers as part of a new power to seek information in Clause 69. This would help investigations carried out to ascertain relevant impacts and decide whether to exercise powers and functions in the Bill.

Evidence is emerging of vertical price fixing at a platform level, which might directly impact the pay of UK workers, including payment of the minimum wage and, therefore, compliance with labour law, as well as customer costs. Such anti-competitive practices via digital platforms are not limited to wages, or gig, remote or office work. Ongoing research on the gigification of work includes connected worker platforms, which tend to be based on the cloud. This is indicative of tight and increasing control, and the retention of scale advantages as these platforms capture information from the workplace to set standards, penalise or incentivise certain types of behaviour, and even advise on business models, such as moving to more flexible and less secure contracts. At the more extreme end, wages are driven so low that workers have no choice but to engage in game-like compensation packages that offer premiums for completion of a high number of tasks in short or unsociable periods of time, engage in risk behaviours or limit mobility.

The Institute for the Future of Work has developed a model which could serve as a basis for this assessment: the good work algorithmic impact assessment. The UK Information Commissioner’s Office grants programme supports it and it is published on the DSIT website. The assessment covers the 10 dimensions of the Good Work Charter, which serves as a checklist of workplace impacts in the context of the digitisation of work: work that promotes dignity, autonomy and equality; work that has fair pay and conditions; work where people are properly supported to develop their talents and have a sense of community. The proposed good work AIA is designed to help employers and engineers to involve workers and their representatives in the design, development and deployment of algorithmic systems, with a procedure for ongoing monitoring.

In summary, these amendments would give the CMA an overarching duty to monitor and consider all these impacts as part of monitoring adverse effects on competition and/or a relevant public interest. We should incorporate this important aspect of digital competition into the Bill. I beg to move.

Lord Knight of Weymouth (Lab)

My Lords, I congratulate the noble Lord, Lord Clement-Jones, on the way he occupied the hot seat and introduced his amendments. I had hoped to add my name to them but other things prevented me doing so. As he said, I co-chair the All-Party Group on the Future of Work with Matt Warman in the other place. I am grateful to the Institute for the Future of Work, and to Anna Thomas in particular for her help in putting these amendments together.

I start with a reflection on industrialisation, which in its own way created a massive explosion in economic activity and wealth, and the availability of goods and opportunities. There was innovation and it was good for consumers, but it also created considerable harms to the environment and to workers. The trade union movement grew up as a result of that.

In many ways, the technological revolution that we are going through, which this legislation seeks to address and, in part, regulate, is no different. As the Minister said a few moments ago, we see new opportunities with the digital tools and products that are being produced as part of this revolution, more jobs, more small and medium-sized enterprises able to grow, more innovation and more opportunities for consumers. These are all positive benefits that we should celebrate when we think about and support the Bill, as we do on all sides of the Committee.

However, the risks for workers, and the other social and environmental risks, are too often ignored. The risks to workers were totally ignored in the AI summit that was held by the Government last year. That is a mistake. During the Industrial Revolution, it took Parliament quite a while to get to the Factory Acts, and to the legislation needed to provide the protection for society and the environment. We might be making the same mistake again, at a time when people are being hired by algorithm and, as the noble Lord, Lord Clement-Jones, pointed out, managed by algorithm, particularly at the lower end of the labour market and in more insecure employment.

The Institute for the Future of Work’s report, The Amazonian Era, focused on the logistics sector. If you were ever wondering why your Amazon delivery arrives with a knock on the door but there is nobody there when you open it to say hello and check that the parcel has been delivered, it is because the worker does not have time to stop and check that someone is alive on the other side of the door—they have to get on. They are being managed by machine to achieve a certain level of productivity. They are wearing personalised devices that monitor how long their loo breaks are if they are working in the big warehouses. There is a huge amount of technological, algorithmic management of workers that is dehumanising and something which we should all be concerned about.

In turn, having been hired and managed by algorithms, people may well be being fired by algorithm as well. We have seen examples such as Amazon resisting trade union recognition in a dispute with the GMB, as the trade union movement also tries to catch up with this and do something about it. Recently, we saw strikes in the creative sector, with writers and artists concerned about the impact on their work of algorithms being used to create, and about how rapidly that is deskilling them. I have been contacted by people in the education world who are exam markers—again, they are being managed algorithmically on the throughput of the exams that they have to mark, despite this being an intensive, knowledge-based, reflective activity of looking at people’s scripts.

In this legislation we have a “user”, “consumer”, “worker” problem, in that all of them might be the same person. We are concerned here about users and consumers, but fail to recognise that the same person may also be a worker, now being sold, as part of an integrated service, with the technology, and sitting at the wrong end of an information asymmetry. We have lots of data that is consumer-centric, and lots of understanding about the impacts on consumers, but very little data on the impact on that same person in their function as a worker.

In the United States, we have seen the Algorithmic Accountability Act. Last month, the Council of Europe published its recommendations on AI. Both are shifting the responsibility towards the companies, giving them a burden of proof to ensure that they are meeting reasonable standards around worker rights and conditions, environmental protection and so on. These amendments seek to do something similar. They want impacts on work, and on workers in particular, to be taken into account in SMS designation, competition decisions, imposition of conduct requirements and compliance reports. It may be that, if the Government had delivered on their promise, made many years ago now, to deliver an employment Bill, we could have dealt with some of these things in that way. But we do not have that opportunity and will not have it for some time.

As I have said, the collective bargaining option for workers is extremely limited; the digital economy has had very limited penetration of trade union membership. It is incumbent on your Lordships’ House to use the opportunities of digital legislation to see whether we can do something to put in place a floor of minimum standards for the way in which vulnerable workers across the economy, not just in specific digital companies, are subject to algorithmic decision-making that is to their disadvantage. We need to do something about it.

Online Safety Bill

Debate between Lord Clement-Jones and Lord Knight of Weymouth
Lord Clement-Jones (LD)

My Lords, I thank the Minister for his introduction today and also for his letter, which set out the reasons for the very welcome amendments that he has tabled today. First, I must congratulate the noble Baroness, Lady Stowell, on her persistence in pushing amendments of this kind to Clause 45, which will considerably increase the transparency of the Secretary of State’s directions if they are made. These amendments to Clause 45 are extremely welcome.

Of course, there is always a “but”—by the way, I am delighted that the Minister took the advice of the House and clearly spent his summer reading through the Bill in great detail, or we would not have seen these amendments, I am sure—but I am just sorry that he did not take the opportunity also to address Clause 176, in terms of the threshold for powers to direct Ofcom in special circumstances, and of course the rather burdensome powers in relation to the Secretary of State’s guidance on Ofcom’s exercise of its functions under the Bill as a whole. No doubt we will see how that works out in practice and whether those powers are going to be used on a frequent basis.

My noble friend Lord Allan—and I must congratulate both him and the noble Lord, Lord Knight, on addressing this very important issue—has set out five assurances that he is seeking from the Minister. I very much hope that the Minister can give those today, if possible.

Congratulations are also due to the noble Baroness, Lady Kennedy, for finding a real loophole in the offence, which has now been amended. We are all delighted to see that the point has been well taken.

Finally, on the point raised by the noble Lord, Lord Rooker, clearly it is up to the Minister to respond to the points made by the committee. All of us would have preferred to see a comprehensive scheme in the primary legislation, but we are where we are. We wanted to see action on apps, and they are to some extent circumscribed within the terms of the Bill. The terms of the Bill—as we have discussed—particularly with the taking out of “legal but harmful”, do not give a huge amount of leeway, so this is not perhaps as skeletal a provision as one might otherwise have thought. Those are my reflections on what the committee has said.

Lord Knight of Weymouth (Lab)

My Lords, I do not know how everyone has spent their summer, but this feels a bit like we have been working on a mammoth jigsaw puzzle and we are now putting in the final pieces. At times, through the course of this Bill, it has felt like doing a puzzle in the metaverse, where we have been trying to control an unreliable avatar that is actually assembling the jigsaw—but that would be an unfair description of the Minister. He has done really well in reflecting on what we have said, influencing his ministerial colleagues in a masterclass of managing upwards, and coming up with reasonable resolutions to previously intractable issues.

We are trusting that some of the outcome of that work will be attended to in the Commons, as the noble Baroness, Lady Morgan, has said, particularly the issues that she raised on risk, that the noble Baroness, Lady Kidron, raised on children’s safety by design, and that my noble friend Lady Merron raised on animal cruelty. We are delighted at where we think these issues have got to.

For today, I am pleased that the concerns of the noble Baroness, Lady Stowell, on Secretary of State powers, which we supported, have been addressed. I also associate myself with her comments on parliamentary scrutiny of the work of the regulator. Equally, we are delighted that the Minister has answered the concerns of my noble friend Lady Kennedy and that he has secured the legislative consent orders which he informed us of at the outset today. We would be grateful if the Minister could write to us answering the points of my noble friend Lord Rooker, which were well made by him and by the Delegated Powers Committee.

I am especially pleased to see that the issues which we raised at Report on remote access have been addressed. I feel smug, as I had to press quite hard for the Minister to leave the door open to come back at this stage on this. I am delighted that he is now walking through the door. Like the noble Lord, Lord Allan, I have just a few things that I would like clarification on—the proportional use of the powers, Ofcom taking into account user privacy, especially regarding live user data, and that the duration of the powers be time-limited.

Finally, I thank parliamentarians on all sides for an exemplary team effort. With so much seemingly falling apart around us, it is encouraging that, when we have common purpose, we can achieve a lot, as we have with this Bill.

Online Safety Bill

Debate between Lord Clement-Jones and Lord Knight of Weymouth
Lord Clement-Jones (LD)

My Lords, I welcome the Minister’s Amendment 238A, which I think was in response to the DPRRC report. The sentiment around the House is absolutely clear about the noble Baroness’s Amendment 245. Indeed, she made the case conclusively for the risk basis of categorisation. She highlighted Zoe’s experience, and I struggle to understand why the Secretary of State is resisting the argument. She knocked down the ninepins of legal uncertainty, and showed how it goes broader than children and illegal content, by reference to Clause 12. The noble Baroness, Lady Finlay, added to the knocking down of those ninepins.

Smaller social media platforms will, on the current basis of the Bill, fall outside category 1. The Royal College of Psychiatrists made it pretty clear that the smaller platforms might be less well moderated and more permissive of dangerous content. It is particularly concerned about the sharing of information about methods of suicide or dangerous eating disorder content. Those are very good examples that it has put forward.

I return to the scrutiny committee again. It said that

“a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”

should be adopted. It seems that many small, high-harm services, of the breadcrumbing kind we have talked about during the passage of the Bill, will be excluded unless we go forward on the basis set out by the noble Baroness, Lady Morgan, while, on the other hand, sites such as Wikipedia, as mentioned by my noble friend, will be swept into the net despite being low risk.

I have read the letter from the Secretary of State which the noble Baroness, Lady Morgan, kindly circulated. I cannot see any argument in it why Amendment 245 should not proceed. If the noble Baroness decides to test the opinion of the House, on these Benches we will support her.

Lord Knight of Weymouth (Lab)

My Lords, I have good news and bad news for the Minister. The good news is that we have no problem with his amendments. The bad news, for him, is that we strongly support Amendment 245 from the noble Baroness, Lady Morgan of Cotes, which, as others have said, we think is a no-brainer.

The beauty of the simple amendment has been demonstrated; it just changes the single word “and” to “or”. It is of course right to give Ofcom leeway—or flexibility, as the noble Baroness, Lady Finlay, described it—in the categorisation and to bring providers into the safety regime. What the noble Baroness, Lady Morgan, said about the smaller platforms, the breadcrumbing relating to the Jake Davison case and the functionality around bombarding Zoe Lyalle with those emails told the story that we needed to hear.

As it stands, the Bill requires Ofcom to always be mindful of size. We need to be more nuanced. From listening to the noble Lord, Lord Allan of Hallam—with his, as ever, more detailed analysis of how things work in practice—my concern is that, in the end, if it is all about size, Ofcom will end up having to bring a much larger number of services into scope under the size categorisation in order to cover all the platforms that it is worried about. If we could give flexibility around size or functionality, that would make the job considerably easier.

We on this side think categorisation should happen with a proportionate, risk-based approach. We think the flexibility should be there, and the Minister is reasonable—come on, what’s not to like?

Online Safety Bill

Debate between Lord Clement-Jones and Lord Knight of Weymouth
Lord Clement-Jones (LD)

My Lords, I add my congratulations to the noble Baroness, Lady Harding, on her tenacity, and to the Minister on his flexibility. I believe that where we have reached is pretty much the right balance. There are the questions that the noble Baroness, Lady Harding, and others have asked of the Minister, and I hope he will answer those, but this is a game-changer, quite frankly. Rightly, the noble Baroness has paid tribute to the companies which have put their heads above the parapet. That was not that easy for them to do when you consider that those are the platforms they have to depend on for their services to reach the public.

Unlike with the research report, there are reserved powers that the Secretary of State can use if the report is positive, which I hope it will be. I believe this could be a turning point. The Digital Markets, Competition and Consumers Bill is coming down the track this autumn, and that is going to give greater powers to make sure that the app stores can be tackled—after all, there are only two of them and they are an oligopoly. They are the essence of big tech, and they need to function in a much more competitive way.

The noble Baroness talked about timing, and it needs to be digital timing, not analogue. Four years does seem a heck of a long time. I hope the Minister will address that.

Then there is the really important aspect of harmful content. In the last group, the Minister reassured us about systems and processes and the illegality threshold. Throughout, he has tried to reassure us that this is all about systems and processes and not so much about content. However, every time we look, we see that content is there almost by default, unless the subject is raised. We do not yet have a Bill that is actually fit for purpose in that sense. I hope the Minister will use his summer break wisely and read through the Bill to make sure that it meets its purpose, and then come back at Third Reading with a whole bunch of amendments that add functionalities. How about that for a suggestion? It is said in the spirit of good will and summer friendship.

The noble Baroness raised a point about transparency when it comes to Ofcom publishing its review. I hope the Minister can give that assurance as well.

The noble Baroness, Lady Kidron, asked about the definition of app store. That is the gatekeeper function, and we need to be sure that that is what we are talking about.

I end by congratulating once again the noble Baroness and the Minister on where we have got to so far.

Lord Knight of Weymouth (Lab)

My Lords, I will start with the final point of the noble Lord, Lord Clement-Jones. I remind him that, beyond the world of the smartphone, there is a small company called Microsoft that also has a store for software—it is not just Google and Apple.

Principally, I say well done to the noble Baroness, Lady Harding, in deploying all of her “winsome” qualities to corral those of us who have been behind her on this and then persuade the Minister of the merits of her arguments. She also managed to persuade the noble Lord, Lord Allan of Misery Guts, that this was a good idea. The sequence of research, then report, then regulation is a good one and, as the noble Lord, Lord Clement-Jones, reminded us, it is being deployed elsewhere in the Bill. I agree with the noble Baroness about the timing: I much prefer two years to four years. I hope that at least Ofcom would have the power to accelerate this if it wanted to do so.

I was reminded of the importance of this in an article I read in the Guardian last week, headed:

“More than 850 people referred to clinic for video game addicts”.


This was in reference to the NHS-funded clinic, the National Centre for Gaming Disorders. A third of gamers receiving treatment there were spending money on loot boxes in games such as “Fortnite”, “FIFA”, “Minecraft”, “Call of Duty” and “Roblox”—all games routinely accessed by children. Over a quarter of those being treated by the centre were children.

Online Safety Bill

Debate between Lord Clement-Jones and Lord Knight of Weymouth
Lord Clement-Jones (LD)

My Lords, I will be extremely brief. We have come a very long way since the Joint Committee made its recommendations to the Government, largely, I think, as a result of the noble Baroness, Lady Kidron. I keep mistakenly calling her “Baroness Beeban”; familiarity breeds formality, or something.

I thank the Minister and the Secretary of State for what they have done, and the bereaved families for having identified these issues. My noble friend Lord Allan rightly identified the sentiments as grief and anger at what has transpired. All we can do is try, in a small way, to redress the harm that has already been done. I was really interested in his insights into how a platform will respond, and how this will help platforms through the process of legal orders and the data protection issues involved in dealing with a public authority.

My main question to the Minister is in that context—the relationship with the Information Commissioner’s Office—because there are issues here. There is, if you like, an overlap of jurisdiction with the ICO, because the potential or actual disclosure of personal data is involved, and therefore there will necessarily have to be co-operation between the ICO and Ofcom to ensure the most effective regulatory response. I do not know whether that has emerged on the Minister’s radar, but it certainly has emerged on the ICO’s radar. Indeed, in the ideal world, there probably should be some sort of consultation requirement on Ofcom to co-operate with the Information Commissioner in these circumstances. Anything that the Minister can say on that would be very helpful.

Again, this is all about reassurance. We must make sure that we have absolutely nailed down all the data protection issues involved in the very creative way the Government have responded to the requests of the bereaved families so notably championed by the noble Baroness, Lady Kidron.

Lord Knight of Weymouth (Lab)

My Lords, first, I associate myself with the excellent way in which the noble Baroness, Lady Harding, paid tribute to the work of the noble Baroness, Lady Kidron, on behalf of Bereaved Families for Online Safety, and with the comments she made about the Minister and the Secretary of State in getting us to this point, which were echoed by others.

I have attached my name, on behalf of the Opposition, to these amendments on the basis that if they are good enough for the noble Baroness, Lady Kidron, they ought to be good enough for me. We should now get on with implementing them. I am also hopeful of learning that the Minister has been liaising with the noble Baroness, Lady Newlove, to ensure that the amendments relating to coroners’ services, and the equivalent procurator fiscal service in Scotland, will satisfy her sense of what will work for victims. I am interested, also, in the answer to the question raised by the noble Baroness, Lady Kidron, regarding a requirement for senior managers to attend inquests. I liked what she had to say about the training for coroners being seen as media literacy and therefore fundable from the levy.

All that remains is for me to ask three quick questions to get the Minister’s position clear regarding the interpretation of the new Chapter 3A, “Deceased Child Users”. First, the chapter is clear that terms of service must clearly and easily set out the policy for dealing with the parents of a deceased child, and must provide a dedicated helpline and a complaints procedure. In subsection (2), does a helpline or similar—the “similar” being particularly important—mean that the provider must offer an accessible, responsive and interactive service? Does that need to be staffed by a human? I think it would be helpful for the Minister to confirm that it is his intention that it should be, so that parents are not fobbed off with solely an automated bot-type service.

Online Safety Bill

Debate between Lord Clement-Jones and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, that was a bravura performance by the noble Lord, Lord Lexden. We thank him. To those listening in the Public Gallery, I should say that we debated most of those; it was not quite as on the nod as it looked.

Amendment 286ZA, in the name of my noble friend Lord Stevenson, seeks to address a critical issue in our digital landscape: the labelling of AI-generated content on social media platforms.

As we navigate the ever-evolving world of technology, it is crucial that we uphold transparency, safeguarding the principles of honesty and accountability. Social media has become an integral part of our lives, shaping public discourse, disseminating information and influencing public opinion. However, the rise of AI-powered algorithms and tools has given rise to a new challenge: an increasing amount of content generated by artificial intelligence without explicit disclosure.

We live in an age where AI is capable of creating incredibly realistic text, images and even videos that can be virtually indistinguishable from those generated by humans. While this advancement holds immense potential, it also raises concerns regarding authenticity, trust and the ethical implications of AI-generated content. The proposed amendment seeks to address this concern by advocating for a simple but powerful solution—labelling AI-generated content as such. By clearly distinguishing human-generated content from AI-generated content, we empower individuals to make informed decisions about the information they consume, promoting transparency and reducing the potential for misinformation or manipulation.

Labelling AI-generated content serves several crucial purposes. First and foremost, it allows individuals to differentiate between information created by humans and that generated by algorithms in an era where misinformation and deep fakes pose a significant threat to public trust. Such labelling becomes a vital tool to protect and promote digital literacy.

Secondly, it enables users to better understand the potential biases and limitations of AI-generated content. AI algorithms are trained on vast datasets, and without labelling, individuals might unknowingly attribute undue credibility to AI-generated information, assuming it to be wholly objective and reliable. Labelling, however, helps users to recognise the context and provides an opportunity for critical evaluation.

Furthermore, labelling AI-generated content encourages responsible behaviour from the platforms themselves. It incentivises social media companies to develop and implement AI technologies with integrity and transparency, ensuring that users are aware of the presence and influence of AI in their online experiences.

Some may argue that labelling AI-generated content is an unnecessary burden or that it could stifle innovation. However, the intention behind this amendment is not to impede progress but to foster a healthier digital ecosystem built on trust, integrity and informed decision-making. By promoting transparency, we can strike a balance that allows innovation to flourish while safeguarding the interests of individuals and society as a whole.

In conclusion, the amendment to label AI-generated content on social media platforms represents a crucial step forward in addressing the challenges of the digital age. By embracing transparency and empowering individuals, we can foster a more informed and discerning society. Let us lead by example and advocate for a digital landscape that values accountability, integrity and the rights of individuals. I urge your Lordships to support this amendment as we strive to build a future where technology works hand-in-hand with humanity for the betterment of all.

In the spirit of the amendment, I must flag that my entire speaking note was generated by AI, as the noble Lord, Lord Allan, from his expression, had clearly guessed. In using this tool, I do so not to belittle the amendment but to illustrate that these tools are already infiltrating everyday life and can supercharge misinformation. We need to do something to help internet users trust what they read.

Lord Clement-Jones (LD)

Does the noble Lord agree that the fact that we did not notice his speech was generated by AI somewhat damages his argument?

Lord Knight of Weymouth (Lab)

The fact that I labelled it as being AI-generated helped your Lordships to understand, and the transparency eases the debate. I beg to move.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, I associate these Benches with the introduction by the noble Baroness, Lady Kidron, support her amendments and, likewise, hope that they form part of the package that is trundling on its way towards us.

Lord Knight of Weymouth (Lab)

My Lords, what more can I say than that I wish to be associated with the comments made by the noble Baroness and then by the noble Lord, Lord Clement-Jones? I look forward to the Minister’s reply.

Online Safety Bill

Debate between Lord Clement-Jones and Lord Knight of Weymouth
Lord Clement-Jones (LD)

There is an element of circularity to what the Minister is saying. This is precisely why we are arguing for the default option. It allows this vulnerability to be taken account of.

Lord Knight of Weymouth (Lab)

Perhaps it would help if the Minister wanted to just set out the difference for us. Clearly, this Committee has spent some time debating the protection for children, which has a higher bar than protection for adults. It is not possible to argue that there will be no difference at the age of 18, however effective the first two elements of the triple shield are. Maybe the Minister needs to think about coming at it from the point of view of a child becoming an adult, and talk us through what the difference will be.

Lord Knight of Weymouth (Lab)

I understand all of that—I think—but that is not the regime being applied to children. It is really clear that children have a safer, better experience. The difference between those experiences suddenly happening on an 18th birthday is what we are concerned about.

Lord Clement-Jones (LD)

Before the Minister stands up—a new phrase—can he confirm that it is perfectly valid to have a choice to lift the user empowerment tool, just as it is to impose it? Choice would still be there if our amendments were accepted.

Online Safety Bill

Debate between Lord Clement-Jones and Lord Knight of Weymouth
Lord Clement-Jones (LD)

My Lords, the proposers of these amendments have made a very good case to answer. My only reservation is that I think there are rather more subtle and proportionate ways of dealing with this—I take on board entirely what the noble Lord, Lord Bethell, says.

I keep coming back to the deliberations that we had in the Joint Committee. We said:

“All statutory requirements on user-to-user services, for both adults and children, should also apply to Internet Society Services likely to be accessed by children, as defined by the Age Appropriate Design Code”.


This goes back to the test that we described earlier, to

“ensure all pornographic websites would have to prevent children from accessing their content”,

and back to that definition,

“likely to be accessed by children”.

The Government keep resisting this aspect, but it is a really important way of making sure that we deal with this proportionately. We are going to have this discussion about minimum age-assurance standards. Rather than simply saying, “It has to be age verification”, if we had a set of principles for age assurance, which can encompass a number of different tools and approaches, that would also help with the proportionality of what we are talking about.

The Government responded to the point we made about age assurance. The noble Baroness, Lady Kidron, was pretty persuasive in saying that we should take this on board in our Joint Committee report, and she had a Private Member’s Bill at the ready to show us the wording, but the Government came back and said:

“The Committee’s recommendations stress the importance of the use of age assurance being proportionate to the risk that a service presents”.


They have accepted that this would be a proportionate way of dealing with it, so this is not black and white. My reservation is that there is a better way of dealing with this than purely driving through these three or four amendments, but there is definitely a case for the Government to answer on this.

Lord Knight of Weymouth (Lab)

My Lords, I think the whole Committee is grateful to my noble friend Lady Ritchie for introducing these amendments so well.

Clearly, there is a problem. The anecdote from the noble Baroness, Lady Kidron, about the call she had had with the barrister relating to those freshers’ week offences, and the sense that people were both offenders and victims, underscored that. In my Second Reading speech I alluded to the problem of the volume of young people accessing pornography on Twitter, and we see the same on Reddit, Discord and a number of other platforms. As the noble Baroness said, it is changing what so many young people perceive to be normal about sexual relationships, and that has to be addressed.

Ofcom very helpfully provided a technical briefing on age assurance and age verification for Members of your Lordships’ House—clearly it did not persuade everybody, otherwise we would not be having this debate. Like the noble Lord, Lord Clement-Jones, I am interested in this issue of whether it is proportionate to require age verification, rather than age assurance.

For example, on Amendment 83 in my noble friend’s name in respect of search, I was trying to work out in my own mind how that would work. If someone used search to look for pornographic content and put in an appropriate set of keywords but was not logged in—so the platform would not know who they are—and if age verification was required, would they be interrupted with a requirement to go through an age-verification service before the search results were served up? Would the search results be served up but without the thumbnails of images and with some of the content suppressed? I am just not quite sure what the user experience would be like with a strict age-verification regime being used, for example, in respect of search services.

Online Safety Bill

Debate between Lord Clement-Jones and Lord Knight of Weymouth
Lord Clement-Jones (LD)

My Lords, we have already had some very significant birthdays during the course of the Bill, and I suspect that, over many more Committee days, there will be many more happy birthdays to celebrate.

This has been a fascinating debate and the Committee has thrown up some important questions. On the second day, we had a very useful discussion of risk which, as the noble Lord, Lord Russell, mentioned, was prompted by my noble friend Lord Allan. In many ways, we have returned to that theme this afternoon. The noble Baroness, Lady Fox, who I do not always agree with, asked a fair question. As the noble Baroness, Lady Kidron, said, it is important to know what harms we are trying to prevent—that is how we are trying to define risk in the Bill—so that is an absolutely fair question.

The Minister has shown flexibility. Sadly, I was not able to be here for the previous debate, and it is probably because I was not that he conceded the point and agreed to put children’s harms in the Bill. That takes us a long way further, and I hope he will demonstrate that kind of flexibility as we carry on through the Bill.

The noble Lord, Lord Moylan, and I have totally different views about what risk it is appropriate for children to face. I am afraid that I absolutely cannot share his view that there is this level of risk. I do not believe it is about eliminating risk—I do not see how you can—but the Bill should be about preventing online risk to children; it is the absolute core of the Bill.

As the noble Lord, Lord Russell, said, the Joint Committee heard evidence from Frances Haugen about the business model of the social media platforms. We listened to Ian Russell, the father of Molly, talk about the impact of an unguarded internet on his daughter. It is within the power of the social media companies to do something about that; this is not unreasonable.

I was very interested in what the noble Viscount, Lord Colville, said. He is right that this is about algorithms, which, in essence, are what we are trying to get to in all the amendments in this really important group. It is quite possible to tackle algorithms if we have a requirement in the Bill to do so, and that is why I support Amendment 261, which tries to address that.

However, a lot of the rest of the amendments are trying to do exactly the same thing. There is a focus not just on moderating harmful content but on the harmful systems that make digital services systematically unsafe for children. I listened with great interest to what the noble Lord, Lord Russell, said about the 5Rights research which he unpacked. We tend to think that media platforms such as Reddit are relatively harmless but that is clearly not the case. It is very interesting that the use of avatars is becoming quite common in the advertising industry to track where advertisements are ending up—sometimes, on pornography sites. It is really heartening that an organisation such as 5Rights has been doing that and coming up with its conclusions. It is extremely useful for us as policymakers to see the kinds of risks that our children are undertaking.

We were reminded about the origins—way back, it now seems—of the Carnegie duty of care. In a sense, we are trying to make sure that that duty of care covers the systems. We have talked about the functionality and harms in terms of risk assessment, about the child safety duties and about the codes of practice. All those need to be included within this discussion and this framework today to make sure that that duty of care really sticks.

I am not going to go through all the amendments. I support all of them: ensuring functionalities for both types of regulated service, and the duty to consider all harms and not just harmful content. It is absolutely not just about the content but making sure that regulated services have a duty to mitigate the impact of harm in general, not just harms stemming from content.

The noble Baroness, Lady Harding, made a terrific case, which I absolutely support, for making sure that the codes of practice are binding and principle based. At the end of the day, that could be the most important amendment in this group. I must admit that I was quite taken with her description of the Government’s response, which was internally contradictory. It was a very weak response to what I, as a member of the Joint Committee, thought was a very strong and clear recommendation about minimum standards.

This is a really important group of amendments and it would not be a difficult concession for the Government to make. They may wish to phrase things in a different way but we must get to the business case and the operation of the algorithms; otherwise, I do not believe this Bill is going to be effective.

I very much take on board what the noble Viscount said about looking to the future. We do not know very much about some of these new generative AI systems. We certainly do not know a great deal about how algorithms within social media companies operate. We will come, no doubt, to later amendments on the ability of researchers and others to find out more, but transparency was one of the things our Joint Committee was extremely keen on, and this is a start.

Lord Knight of Weymouth (Lab)

My Lords, I too agree that this has been a really useful and interesting debate. It has featured many birthday greetings to the noble Baroness, Lady Kidron, in which I obviously join. The noble Lord, Lord Moylan, bounced into the debate that tested the elasticity of the focus of the group, and bounced out again. Like the noble Lord, Lord Clement-Jones, I was particularly struck by the speech from the noble Baroness, Lady Harding, on the non-mandatory nature of the codes. Her points about reducing Ofcom’s workload, and mandatory codes having precedent, were really significant and I look forward to the Minister’s response.

If I have understood it correctly, the codes will be generated by Ofcom, and the Secretary of State will then table them as statutory instruments—so they will be statutory, non-mandatory codes, but with statutory penalties. Trying to unravel that in my mind was a bit of a thing as I was sitting there. Undoubtedly, we are all looking forward to the Minister’s definition of harm, which he promised us at the previous meeting of the Committee.

I applaud the noble Lord, Lord Russell, for the excellent way in which he set out the issues in this grouping and—along with the Public Bill Office—for managing to table these important amendments. Due to the Bill’s complexity, it is an achievement to get the relatively simple issue of safety by design for children into amendments to Clause 10 on children’s risk assessment duties for user-to-user services; Clause 11 on the safety duties protecting children; and the reference to risk assessments in Clause 19 on record-keeping. There is a similar set of amendments applying to search; to the duties in Clause 36 on codes of practice duties; to Schedule 4 on the content of codes of practice; and to Clause 39 on the Secretary of State’s powers of direction. You can see how complicated the Bill is for those of us attempting to amend it.

What the noble Lord and his amendments try to do is simple enough. I listened carefully to the noble Baroness, Lady Fox, as always. The starting point is, when designing, to seek to eliminate harm. That is not to say that they will eliminate all potential harms to children, but the point of design is to seek to eliminate harms if you possibly can. It is important to be clear about that. Of course, it is not just the content but the systems that we have been talking about, and ensuring that the codes of practice that we are going to such lengths to legislate for are stuck to—that is the point made by the noble Baroness, Lady Harding—relieving Ofcom of the duty to assess all the alternative methods. We certainly support the noble Lord, Lord Russell, in his amendments. They reinforce that it is not just about the content; the algorithmic dissemination, in terms of volume and context, is really important, especially as algorithms are dynamic—they are constantly changing in response to the business models that underpin the user-to-user services that we are debating.

The business models want to motivate people to be engaged, regardless of safety in many ways. We have had discussion of the analogy on cars and planes from the noble Lord, Lord Allan. As I recall, in essence he said that in this space there are some things that you want to regulate like planes, to ensure that there are no accidents, and some where you trade off freedom and safety, as we do with the regulation of cars. In this case, it is a bit more like regulating for self-driving cars; in that context, you will design a lot more around trying to anticipate all the things that humans when driving will know instinctively, because they are more ethical individuals than you could ever programme an AI to be when driving a car. I offer that slight adjustment, and I hope that it helps the noble Lord, Lord Moylan, when he is thinking about trains, planes and automobiles.

In respect of the problem of the business models and their engagement over safety, I had contact this weekend and last week from friends much younger than I am, who are users of Snap. I am told that there is an AI chatbot on Snap, which I am sure is about engaging people for longer and collecting more data so that you can engage them even longer and, potentially, collect data to drive advertising. But you can pay to get rid of that chatbot, which is the business model moving somewhere else as and when we make it harder for it to make money as it is. Snap previously had location sharing, which you had to turn off. It created various harms and risks for children that their location was being shared with other people without them necessarily authorising it. We can all see how that could create issues.

Data Protection Bill [HL]

Debate between Lord Clement-Jones and Lord Knight of Weymouth
Monday 30th October 2017

Lords Chamber
Lord Clement-Jones

My Lords, I thank the noble Baroness for that accolade. I rise to speak to Amendment 170, which is a small contribution to perfecting Amendment 169. It struck me as rather strange that Amendment 152 has a reference to charities, but not Amendment 169. For charities, this is just as big an issue so I wanted to enlarge slightly on that. This is a huge change that is overtaking charities. How they are preparing for it and the issues that need to be addressed are of great concern to them. The Institute of Fundraising recently surveyed more than 300 charities of all sizes on how they are preparing for the GDPR, and used the results to identify a number of areas where it thought support was needed.

The majority of charities, especially the larger ones, are aware of the GDPR and are taking action to get ready for May 2018, but the survey also highlighted areas where charities need additional advice, guidance and support. Some 22% of the charities surveyed said that they have yet to do anything to prepare for the changes, and 95% of those yet to take any preparatory action are the smaller charities. Some 72% said that there was a lack of clear available guidance. Almost half the charities report that they do not feel they have the right level of skills or expertise on data protection, and 38% report that they have found limits in their administration or database systems, or the costs of upgrading these, a real challenge. That mirrors very much what small businesses are finding as well. Bodies such as the IoF have been working to increase the amount of support and guidance on offer. The IoF runs a number of events, but more support is needed.

A targeted intervention is needed to help charities as much as it is needed for small business. This needs to be supported by government—perhaps through a temporary extension of the existing subsidised fundraising skills training, including an additional training programme on how to comply with GDPR changes; or a targeted support scheme, directly funded or working with other funding bodies and foundations, to help the smallest charities most in need to upgrade their administrative or database systems. Charities welcome the recently announced telephone service from the ICO offering help on the GDPR, which they can access, but it is accessible only to organisations employing under 250 people and it is only a telephone service.

There are issues there, and I hope the Minister will be able to respond, in particular by recognising that charities are very much part of the infrastructure of smaller organisations that will certainly need support in complying with the GDPR.

Lord Knight of Weymouth (Lab)

My Lords, I broadly support what these interesting amendments are trying to do. I declare my interest as a member of the board of the Centre for Acceleration of Social Technology. Substantially, what it does is advise mainly larger charities on how best to take advantage of digital to solve some of their problems.

Clearly, I support ensuring that small businesses, small charities and parish councils, as mentioned, are advised of the implications of this Act. If she has the opportunity, I ask the noble Baroness, Lady Neville-Rolfe, to explain why she chose staff size as the measure. I accept that hers is a probing amendment and she may think there are reasons not to go with staff size. The cliché is that when Instagram was sold to Facebook for $1 billion it had 13 members of staff. That would not come within the scope of the amendment, but there are plenty of digital businesses that can achieve an awful lot with very few staff. As it stands, my worry is that this opens up a huge loophole.

--- Later in debate ---
Lord Knight of Weymouth

My Lords, I will be brief on this group but I have two points to make. One is a question in respect of Amendment 51, where I congratulate the insurance industry on its lobbying. Within proposed new paragraph 15A(1)(b) it says,

“if … the controller has taken reasonable steps to obtain the data subject’s consent”.

Can the Minister clarify, or give some sense of, what “reasonable” means in this context? It would help us to understand whether that means an email, which might go into spam and not be read. Would there be a letter or a phone call to try to obtain consent? What could we as citizens reasonably expect insurance companies to do to get our consent?

Assuming that we do not have a stand part debate on Clause 4, how are the Government getting on with thinking about simplifying the language of the Bill? The noble Baroness, Lady Lane-Fox, is temporarily not in her place, but she made some good points at Second Reading about simplification. Clause 4 is quite confusing to read. It is possible to understand it once you have read it a few times, but subsection (2) says, for example, that,

“the reference to a term’s meaning in the GDPR is to its meaning in the GDPR read with any provision of Chapter 2 which modifies the term’s meaning for the purposes of the GDPR”.

That sort of sentence is quite difficult for most people to understand, and I will be interested to hear of the Government’s progress.

Lord Clement-Jones

My Lords, I thank the noble Baroness for introducing these amendments in not too heavy a style, but this is an opportunity to ask a couple of questions in relation to them. We may have had since 20 October to digest them; nevertheless, that does not make them any more digestible. We will be able to see how they really operate only once they are incorporated into the Bill. Perhaps we might have a look at how they operate on Report.

The Bill is clearly a work in progress, and this is an extraordinary number of amendments even at this stage. It begs the question as to whether the Government are still engaged in discussions with outside bodies. Personally, I welcome that there has been dialogue with the insurance industry—a very important industry for us. We obviously have to make sure that the consumer is protected while it carries out an important part of its business. I know that the industry has raised other matters relating to third parties and so on. There have also been matters raised by those in the financial services industry who are keen to ensure that fraud is prevented. Even though they are private organisations, they are also keen to ensure that they are caught under the umbrella of the exemptions in the Bill. Can the noble Baroness tell us a little about what further discussions are taking place? It is important that we make sure that when the Bill finally hits the deck, so to speak, it is right for all the different sectors that will be subject to it.