House of Lords

Thursday 27th April 2023

Lords Chamber
11:00
Prayers—read by the Lord Bishop of Guildford.

Oaths and Affirmations

Thursday 27th April 2023

Lords Chamber
11:06
Baroness Hallett took the oath.

Banks: Closures and Shared Banking Hubs

Thursday 27th April 2023

Lords Chamber
Question
11:07
Asked by
Lord Holmes of Richmond

To ask His Majesty’s Government how many (1) bank branches have closed, and (2) shared banking hubs have opened, in the last 12 months; and what steps they are taking to minimise the former and speed up the latter.

Lord Holmes of Richmond (Con)

My Lords, in begging leave to ask the Question standing in my name on the Order Paper, I declare my financial services interests as set out in the register.

The Parliamentary Secretary, HM Treasury (Baroness Penn) (Con)

My Lords, the Government do not make assessments of bank branch networks or intervene in commercial decisions to close branches. Banks should follow FCA guidance, including considering alternative access where appropriate. One example of this is shared banking hubs. More than 50 hubs have been announced, with four now open, and the pace of delivery is expected to accelerate over the coming months. People can also access everyday banking via their local post office.

Lord Holmes of Richmond (Con)

My Lords, in the past 12 months, 847 bank branches have closed or are set to close. Four shared banking hubs have opened. Does my noble friend the Minister agree that the Government need to act to ensure local banking provision, including deposit taking as well as withdrawals and advice? They must also act to ensure acceptance of, as well as access to, cash; otherwise, what currency is cash if there is no place to spend it? Finally, will the Government consider carefully commissioning a review into access to digital financial services to ensure that everyone can benefit from all the financial innovations in that space?

Baroness Penn (Con)

My Lords, as my noble friend will know, in the Financial Services and Markets Bill, we are legislating to protect access to cash. That covers withdrawal as well as deposit services. The Government do not plan to mandate the acceptance of cash. That would be an unprecedented intervention. However, the increased access particularly to deposit services for businesses should allow those who wish to continue to accept cash to be able to do so on a more sustainable footing. My noble friend makes an interesting suggestion. The Government are working hard to ensure financial inclusion, including digital financial inclusion. I will think about his suggestion very carefully.

Baroness Kramer (LD)

My Lords, getting a banking hub still requires the voluntary participation of the banks, which is part of the reason why the pace of progress has been so slow. Will the Government consider changing the rules so that any community that meets the standards to justify a banking hub, as assessed by LINK, then has an automatic right to that hub and can overcome bank resistance?

Baroness Penn (Con)

My Lords, the Government are not considering changing the framework. As I said in response to the Question, we expect the pace of delivery to pick up. Shared banking hubs are one initiative to ensure that communities can continue to access banking. I mentioned the Post Office as being another route: 99% of personal and 95% of business banking customers can carry out their everyday banking there, with more than 11,000 branches across the UK.

Lord Kirkhope of Harrogate (Con)

My Lords, my noble friend will recollect, no doubt, an earlier Question asked by my noble friend Lord Holmes about the nature of the facilities provided by ATMs and banks, particularly for those with disabilities. Will my noble friend therefore confirm that, in the establishment of these hubs, there will be a requirement on them to be careful to provide the sorts of facilities that are suitable for people with disabilities, as the banks were starting to do?

Baroness Penn (Con)

My Lords, in taking forward this work, I am sure that that is a consideration the banks have in mind. The banking hubs came out of a pilot programme that allowed banks to test out this model to ensure that it was accessible to all their customers. Of course, they are subject to the equality duty, which also means that they need to make proper provision for those with protected characteristics.

Lord Cormack (Con)

My Lords, legal tender is legal tender. I urge my noble friend to bear in mind that the Government have the opportunity, if they wish, to mandate the use of cash—people can use it when they want. Will she also bear in mind that a lot of people now are being discouraged from writing cheques? Many people like to pay their bills with cheques. All these facilities should remain, certainly for the next two decades.

Baroness Penn (Con)

My Lords, the Government acknowledge the important role that cash still plays in many of our lives, which is why we are taking unprecedented action on protecting access to cash. As I said, ensuring that businesses have access to deposit facilities will also promote ongoing cash acceptance by businesses.

Lord Cormack (Con)

What about cheques?

Baroness Chapman of Darlington (Lab)

I do not think my children know what a cheque is, actually. The Social Market Foundation and the Treasury Select Committee in the other place have expressed some concern about the overreliance on post offices as a stopgap. Postal staff are wonderful, but they are not trained banking specialists. Does the Minister agree that we need that trusted expertise to be available on our high streets? Does she also agree that some post offices just are not suitable for many of the requirements of face-to-face banking, especially for more vulnerable customers, as they do not provide the privacy and dignity that many bank customers need?

Baroness Penn (Con)

I agree with the noble Baroness that the Post Office can play a really important role in ensuring ongoing provision, but it should not be the only provider of services. There are other services that are more appropriately delivered in other ways, including in person, which is part of where banking hubs come in. As I have said, we hope to see the delivery of those hubs accelerated this year. It is also reassuring to hear that several banks have committed that if their branch is the last in town, it will stay open until the relevant banking hub is up and running, to ensure continuity of service.

The Lord Bishop of Durham

In my local town of Bishop Auckland, Newcastle Building Society and Darlington Building Society have moved on to the high street as banks have moved off it. Will the Minister commend building societies for their commitment to local communities and to making things accessible to them, and will she encourage further work on that?

Baroness Penn (Con)

I absolutely commend building societies and all businesses that have a commitment to local communities and are thinking about how they can make their services as accessible as possible. There are many different routes to ensuring accessibility. We should focus on the outcome for the customer and embrace the different routes that this can be delivered by.

Lord Blunkett (Lab)

My Lords, the bigger the profit, the less customer service there is. This has happened over the last decade. There are still some banks pretending that they are disabled by Covid and that is why you cannot get through on the phone, and the local branch is closed so you cannot actually talk to anyone. Will the Minister ask the banks to start putting the customer first and ensure that there are facilities available, not just at the odd hub but in local communities, which, in the past, could rely on serious, person-to-person customer service?

Baroness Penn (Con)

My Lords, a process has been put in place to allow communities to make the case through LINK for where they need access to further services, and there is a commitment that if something is deemed necessary, it will be implemented. The noble Lord is right that it is essential that the interests of consumers are properly considered in all areas of financial services. There is the new consumer duty, which is due to be implemented later this year and will take forward some of his suggestions.

Lord Herbert of South Downs (Con)

My Lords, in contrast to my noble friend Lord Cormack, I do not know what a cheque is; I thought it was something one received in a restaurant in the United States. I do not carry cash and, in common with millions of people, I pay using contactless technology. Of course, some still need cash, including small businesses, but, as my noble friend says, is not the Post Office network a ready-made, available network for cash, which almost every business can use and is guaranteed in terms of proximity?

Baroness Penn (Con)

My noble friend is right about the breadth of the Post Office network, and I have talked about the high percentages of people who can access their everyday banking services through it. It is also geographically widespread; 93% of the UK’s population live within one mile of a post office and 99.7% within three miles of their nearest post office. There are other services that people need to be able to access, which is why it is important that we encourage banks to continue to innovate so that people can access the services in the way that is most appropriate for them.

Climate Change Committee: Discussions

Thursday 27th April 2023

Lords Chamber
Question
11:17
Asked by
Baroness Sheehan

To ask His Majesty’s Government what discussions they have had with the Climate Change Committee about (1) the impact newly licensed oil and gas infrastructure will have on domestic and global emissions, and (2) the design of their ‘Climate Compatibility Checkpoint’.

The Parliamentary Under-Secretary of State, Department for Energy Security and Net Zero (Lord Callanan) (Con)

My Lords, the Government work closely with the Climate Change Committee and are grateful for its expert independent advice. The committee provided advice on 24 February 2022 in relation to both new licensing and the climate compatibility checkpoint; the advice was published on the committee’s website. Officials also had several discussions with the committee throughout the design process for the checkpoint. Its advice was considered in the final design, which has now been published on the GOV.UK website.

Baroness Sheehan (LD)

My Lords, the climate compatibility checkpoint, in reference to new oil and gas fields, is, quite frankly, doublethink of Orwellian proportions. Can the Minister confirm that the IEA, the IPCC, the vast bulk of UK scientists and the Government’s own net zero tsar, Chris Skidmore, have all stated that the opening of new fields is incompatible with keeping global warming within the 1.5 degree scenario necessary to protect us and the natural world from catastrophic climate breakdown?

Lord Callanan (Con)

I do not agree with the noble Baroness. She is dead wrong about these matters. The reality is, whether the Liberal Democrats like it or not, that we get about 75% of our energy from oil and gas. That is declining, and the North Sea is a declining field. Unless she is proposing to tell voters that they should disconnect their gas boilers or not drive their cars anywhere, we have a requirement for oil and gas in the future, albeit for a declining amount. Therefore, the only question is whether we get them from our own fields and employ British workers, paying British taxes, or whether we import them from abroad, which usually has a higher carbon footprint. That is the choice that faces us.

Lord Deben (Con)

Is my noble friend aware that the Government asked for the Climate Change Committee’s advice and then ignored it? First, the Climate Change Committee said that it was perfectly possible to do this if there were a proper checkpoint. The checkpoint is not what we asked for. Secondly, the committee said that the Government should make sure that all extraction from the North Sea should be of the highest environmental level. We have not insisted on that. Norway has a much higher level. Thirdly, the committee said that the Government should accept that they should not increase the amount of oil being produced on the excuse of the war in Ukraine. Why have the Government not accepted the CCC’s advice?

Lord Callanan (Con)

Let me give my noble friend some other quotes from the letter from the Climate Change Committee, with which he is of course closely associated:

“UK extraction has a relatively low carbon footprint (more clearly for gas than for oil) and the UK will continue to be a net importer of fossil fuels for the foreseeable future, implying there may be emissions advantages to UK production replacing imports”.


I think he should read the letter that he sent.

Baroness Blackstone (Lab)

My Lords, what steps are the Government taking to reduce the impact of flaring? I am sure the Minister is aware that routine flaring, which incidentally has been banned in Norway since 1970, has a very bad effect on the environment, as it releases methane, which is 80 times more potent than CO2 over a 20-year period. As a result, if Rosebank goes ahead, we will exceed our carbon budget.

Lord Callanan (Con)

As the noble Baroness is aware, we have a plan to reduce our flaring. We had a Question on that a few weeks ago. We have committed, along with many other countries, to eliminate flaring by 2030. The amount of flaring is declining rapidly across the North Sea and action is being taken.

Baroness Boycott (CB)

My Lords, can I follow up on the last question? The Rosebank oilfield, which has just been licensed, is the largest undeveloped field in the North Sea. It is going to create 200 million tonnes of CO2, which is more than the combined annual emissions of all 28 low-income countries in the world. Most of the oil is going to be exported; it is not going to lower our domestic bills. Can the Government tell me what the benefits from this are? How on earth is this showing global leadership, at a time when all the institutions are saying that we have to stop extracting oil and gas to defeat climate change and temperature rise?

Lord Callanan (Con)

I refer the noble Baroness to the answer I gave to the noble Baroness, Lady Sheehan. We still have in this country a requirement for oil and gas. Some 80% of our space heating comes from gas. We need to phase that out in a transition. Over the years, we need to electrify more, but in the short term we have a requirement for oil and gas. The question is whether we want to get it from Qatar or Saudi Arabia and pay taxes abroad, or employ our own people in the North Sea to extract those same reserves?

Baroness Ritchie of Downpatrick (Lab)

My Lords, the Dasgupta review commissioned by the Treasury warned against the continued use of subsidies towards fossil fuels because they are driving biodiversity loss. Before the Minister says that they do not subsidise them, there are tax breaks, investment allowances and decommissioning loopholes—all of which are subsidies. What can the Minister say today about dealing with biodiversity loss and ending those subsidies towards fossil fuels?

Lord Callanan (Con)

I am sorry to disappoint the noble Baroness but the Minister is going to say that we do not subsidise fossil fuels, because that is the case. In fact, the opposite is true. We gain billions of pounds per year in tax revenues from fossil fuels.

Baroness Bennett of Manor Castle (GP)

My Lords, would the Minister agree with the right honourable Member in the other place Chris Skidmore, the chair of the independent review of net zero, who has come out in opposition to the new Rosebank field development? He recently said:

“We must not let the industries of the past dictate our future”.

Lord Callanan (Con)

I actually agree with him on that particular statement. Of course we need to move towards phasing out fossil fuel use; nobody disagrees with that. We have a legal commitment to do that and we are doing so through a transition. As I said in response to previous questions, the question is where we get those reserves from in future. Even with new licensing, UK production in the North Sea will continue to decline at a rate of about 7% per year. At the moment we are importing LNG to satisfy our domestic demand, which has about twice the carbon footprint of that produced in the North Sea. I really do not understand the point the noble Baroness is making.

Baroness Blake of Leeds (Lab)

My Lords, as we have heard, the CCC’s report last month emphasised the need for decarbonising and expanding the electricity system to rapidly reduce the UK’s demand for fossil fuels. As mentioned in the report, the Government still have not provided a coherent strategy or essential details on how they will achieve their goal of decarbonisation by 2035. When will these be provided? When will the Government accept that the quickest and cheapest way to offer the required supply of variable renewables to do so will involve onshore wind and solar?

Lord Callanan (Con)

Decarbonising our electricity system, which we are doing at the fastest rate of all G7 countries, will require much more electrification. Renewable generation capacity is currently six times greater than in 2010. We are expanding to deliver up to 50 gigawatts of offshore wind capacity by 2030. We have said that we will also consider onshore wind in future CfD rounds. We have one of the highest solar capacities in Europe as well—in fact, we have more solar capacity than even countries such as France.

Lord Newby (LD)

My Lords, the Minister’s defence of new exploration and production in the North Sea is that the carbon footprint of the oil and gas produced will be less because it will be consumed here. This goes against all the evidence. Can the Minister therefore give the House an assurance that all future production of oil and gas in the North Sea will be consumed in the UK in order to reap the benefits which he so repeatedly announces?

Lord Callanan (Con)

The reason I said it was lower carbon intensity is that that is a fact. There are lots of studies being done on it. Imported LNG has about twice the carbon footprint of domestic production. Of course I cannot give him a guarantee that it will all be consumed within the UK, because it is an international market. We have pipelines, for instance, interlinking our gas supply with the continent, as the noble Lord well knows. If the Liberal Democrats really believe that we should stop our production tomorrow, I look forward to all the focus leaflets—which are being distributed at the moment—telling people that they have to stop using their gas boilers or driving their cars. Lots of leaflets are being produced but I have not noticed the Liberals saying that in public.

Baroness McIntosh of Hudnall (Lab)

My Lords, the Minister did not answer the question on the impact on biodiversity of fossil fuel extraction. Could he have another go now?

Lord Callanan (Con)

Of course it has an impact on biodiversity, but we have very strict climate and environmental studies that need to be done before any fields are licensed. This is the subject of court action at the moment, as the noble Baroness probably realises, so I cannot comment on it in detail. We follow all the required biodiversity protocols.

Overseas Territories: Illegal Immigration

Thursday 27th April 2023

Lords Chamber
Question
11:27
Asked by
Lord Lancaster of Kimbolton

To ask His Majesty’s Government what plans they have, if any, to support the Overseas Territories in the Caribbean with the challenge of illegal immigration.

Lord Lancaster of Kimbolton (Con)

I beg leave to ask the Question standing in my name on the Order Paper and remind your Lordships of my interest as the honorary colonel of the Cayman Islands Regiment.

The Minister of State, Foreign, Commonwealth and Development Office (Lord Goldsmith of Richmond Park) (Con)

I salute my noble friend for his contribution to the overseas territories. The Prime Minister has been clear that supporting the overseas territories is a top priority for this Government. That includes supporting Caribbean overseas territories tackling irregular migration. I am working closely with colleagues across government to strengthen our collective support for the OTs. The Turks and Caicos Islands face particularly high levels of irregular migration from Haiti. The UK’s support package includes FCDO-funded work to introduce electronic borders and procuring a maritime surveillance aircraft.

Lord Lancaster of Kimbolton (Con)

My Lords, last month I visited the Turks and Caicos Islands with the Chief of the General Staff to see the work of the TCI regiment, which is supporting the countermigration challenges the islands face. It is a very real problem. So far this year, some 1,599 Haitians have been intercepted—which, for an island with a population of just 60,000, is an enormous challenge. Notwithstanding the work of my noble friend, who I know is committed to the OTs, I must say that I was underwhelmed by the response of His Majesty’s Government. It really is a challenge. The problem seems to be that other government departments here in the UK view the OTs as not their problem but an FCDO problem. However, the FCDO does not have the levers to pull to help the overseas territories, for example in policing. If the FCDO is unable to support the OTs, should responsibilities be transferred to the Cabinet Office to ensure a whole of government approach to supporting our overseas territories?

Lord Goldsmith of Richmond Park (Con)

My noble friend raises an important point; I know I am expected to say this, but I am genuinely grateful to him for raising this issue, which is not raised enough in this place. The problem he described is serious, but he is semi-right in relation to the FCDO. The FCDO is air traffic control for the OTs; the levers of delivery belong to other departments of government. But I pay tribute to the team in the FCDO, given that it is the department, notwithstanding what I just said, delivering the most for the OTs. We commissioned a serious crime review before the situation escalated in TCI, and urgently requested the deployment of UK police—and funded this. It is true, as has been noted, that UK police pulled their officers out and chose not to provide operational officers at the time they were needed. That was a mistake on their part, but the Foreign Office then secured a further UK police capacity-building team and separately procured a 16-strong operational serious crime team for TCI through commercial routes, and that team is in place and making a big difference today. The FCDO also requested and funded the support of a Royal Navy helicopter at the height of the crisis in the TCI. The Foreign Secretary has been working with the Prime Minister and myself to ensure that all government departments understand their role in supporting the overseas territories. The noble Lord makes an important point that this is not someone else’s problem. The OTs are part of the UK family and the message has gone out from the Foreign Secretary and the Prime Minister, and to individual Ministers from me, that the Government need to step up across Whitehall.

Baroness Wheeler (Lab)

My Lords, the international medical charity MSF has underlined that gang violence has spread to every part of Port-au-Prince, displacing many residents who are now living in dire conditions. Hospitals, clinics and schools have been forced to close, worsening already appalling food shortages and limiting access to clean drinking water. What steps are the UK and UN taking to help address the violence and humanitarian situation, and to support those fleeing the country to find safety?

Lord Goldsmith of Richmond Park (Con)

I thank the noble Baroness for her question. Of course, Haiti is not an overseas territory, but it has a big impact on neighbouring overseas territories, as we have been discussing. We are obviously very concerned. We used our platform in the UN Security Council to support the UN sanctions back in October. We continue to engage in Security Council discussions, including considering Haitian requests for security assistance, and we want stability and security as soon as possible in Haiti. We are supporting it through contributions to the UN and other international agencies that have a strong presence on the ground, including the World Bank, and we are working with the UN office in Haiti and the international community to support a peaceful, democratic and Haitian-led solution for the Haitian people.

Lord Purvis of Tweed (LD)

My Lords, the Minister knows that the OTs operate their own visa regime, which is separate from that of the United Kingdom. Given the violence and climate crises in that near neighbourhood, there are no safe and legal routes for seeking asylum. Are the OTs fully covered by the proposals in the Government’s Illegal Migration Bill, which means that they will now have to detain and then remove to Rwanda any of those individuals? What are the mechanisms for providing support for detention facilities within the OTs and supporting the cost of flights to Rwanda that the Government are now going to insist the OTs carry out? What was their response? I assume the Government and FCDO consulted them. What was the response to the consultation?

Lord Goldsmith of Richmond Park (Con)

Different OTs have different challenges and problems. We began the conversation about TCI, where the migration problem is on a scale that is incomparable. If it was translated into UK conditions, it would be like 4 million or 5 million people crossing the Channel every year, and clearly that is a major problem for a small island with a small population. What we are doing in TCI is helping the country return those refugees to Haiti where possible. Similarly, we have a problem in Cayman, where large numbers of people are fleeing from Cuba. The answer there is to return people wherever possible to Cuba. The only issue that seems to be of interest to Parliament at the moment relates to the British Indian Ocean Territory, where we have a particular problem with refugees, mostly from Sri Lanka, who are inhabiting an area that is effectively uninhabitable. There we have particular issues and it is in those circumstances where the Rwanda option may be the best one.

Lord Bellingham (Con)

My Lords, when I had the good fortune to do the Minister’s job in the other place, I was able to visit most of the OTs. One of the consistent themes was the lack of capacity, experience and training across the Governments of the OTs. One way to address this is to put in place twinning arrangements with local authorities in the UK. One such partnership was between the TCI and Hertfordshire County Council to exchange and train staff, move people and embed them and, above all, build that crucial capacity. Is that twinning arrangement still going? What plans do he and HMG have to put in place further such arrangements?

Lord Goldsmith of Richmond Park (Con)

There are actually quite a few arrangements of the sort that the noble Lord describes—on education, policing and a wide range of issues. There are too many for me to regale now in the short time that we have, but I am happy to write to him and detail some of the most effective arrangements in place. I would emphasise the point made in the original Question. Different government departments need to recognise that we have a constitutional responsibility to the overseas territories. While the FCDO is a key central organisation in ensuring that that delivery happens, different government departments need to recognise that the inhabitants of the overseas territories are no less His Majesty’s subjects than we are in this place.

Lord Boateng (Lab)

My Lords, HMS “Medway” was deployed to very considerable and very good effect in the Caribbean in 2022. Why cannot it or a vessel of a similar class be deployed in the Caribbean in support of the overseas dependencies in 2023? If it cannot, is that not a good argument for having a permanent naval presence in the Caribbean?

Lord Goldsmith of Richmond Park (Con)

The noble Lord makes a good point and I agree with him that HMS “Medway” and the auxiliary ship RFA “Tideforce” were of huge assistance in the Turks and Caicos Islands in the wake of Hurricane Fiona. “Medway” then supported the Cayman Islands in response to Hurricane Ian. HMS “Dauntless” will be in the region from 1 June this year to provide a consistent maritime presence in the Caribbean, including humanitarian assistance and disaster response support. It is our intention and duty to ensure we have that presence when needed, particularly during the hurricane season.

Lord Anderson of Swansea (Lab)

My Lords, if flights to Rwanda are an option for the Caribbean, overseas territories and beyond, who picks up the bill?

Lord Goldsmith of Richmond Park (Con)

The noble Lord raises what I think is currently an academic question. The Rwanda option is being explored in relation to the refugees I mentioned earlier who have landed in Chagos—Diego Garcia. We have a particular issue there, given that the facilities are not appropriate. The area that the refugees currently occupy is not strictly inhabitable and we need to return as many of those people as possible. I would add that 130 individuals have already voluntarily returned home and the numbers are now pretty small.

Baroness Bennett of Manor Castle (GP)

My Lords, on the issue of assistance from HMG to the overseas territories, can the Minister confirm that carbon emissions from overseas territories count under the UK’s net-zero target? What support are the Government providing to those overseas territories to tackle their carbon emissions?

Lord Goldsmith of Richmond Park (Con)

The key value of the overseas territories is related less to carbon—their emissions are minuscule—than to the fact that 96% of UK biodiversity is in the overseas territories. That is an enormous source of pride for the UK, and rightly so. We provide a lot of financial support through Darwin Plus, which we expanded to £10 million annually. We have £2 million also available this year to the OTs through the CSSF. We have the Blue Belt programme, which has grown—Anguilla joined a few months ago and another overseas territory will be joining. I long to tell the House about that but I cannot do so yet. That programme continues to grow. We are focusing a lot of effort and energy in helping the OTs to protect and enhance their biodiversity. I did not answer the question about whether emissions are included, because I am afraid that I do not know the answer. My colleague here no doubt does.

Rt Hon Dominic Raab MP: Resignation Letter

Thursday 27th April 2023

Lords Chamber
Question
11:40
Asked by
Lord Davies of Brixton

To ask His Majesty’s Government what action, if any, they are considering following the comments made about civil servants by the Rt Hon Dominic Raab MP in his resignation letter to the Prime Minister dated 21 April.

The Minister of State, Cabinet Office (Baroness Neville-Rolfe) (Con)

My Lords, the Prime Minister has been clear that the Civil Service is vital to the work of the Government. The Government greatly value the work of civil servants who, together with Ministers, are working to deliver for the British people. The Prime Minister has accepted the resignation of the right honourable Dominic Raab, the former Deputy Prime Minister, following the findings of Adam Tolley KC, in a published exchange of letters.

Lord Davies of Brixton (Lab)

My Lords, I welcome much of the Minister’s reply, but does she accept that the emerging pattern we see is not civil servants conspiring against their Ministers? The pattern documented is of Conservative Ministers bullying their staff, with three examples in the current Parliament, two of which led to resignations and one of which should have led to a resignation.

Baroness Neville-Rolfe (Con)

I cannot accept the conclusion of the noble Lord. Of course, as the Prime Minister said, we need to learn from these cases

“how to better handle such matters in future”,

and a credible complaints process needs to have the confidence of Ministers and civil servants alike. Work is under way on that. Ministers and civil servants work together on difficult issues every day and, in the main, very constructively.

Lord Fowler (CB)

My Lords, as someone who headed four separate departments, all under Conservative Governments, in my experience overwhelmingly the Civil Service was loyal and gave exceptional advice to the Government. Would it not be better to look at the quality of special advisers, who sometimes exhibit neither of those qualities?

Baroness Neville-Rolfe (Con)

Having worked as an adviser, a Minister and a civil servant, I would say that the constitution has these different parts. Political advisers are important and helpful. In most cases, they work well with the Civil Service.

Lord Lilley (Con)

My Lords, is it not important to recognise that Ministers have no power to select, reward, promote or demote officials working for them? Likewise, officials should not have the power effectively to dismiss Ministers for whom they work, least of all by making anonymous complaints against them. I was very fortunate, like the noble Lord, Lord Fowler, that my officials were a joy to work with throughout, but some Ministers have perceived some officials to be reluctant to implement their policies and have had to try to find ways of dealing with that, and some officials have perceived Ministers’ responses in trying to get them to do that as too abrasive, demanding and rude. I sympathise with those who had to duck telephones thrown by Gordon Brown or to deal with Richard Crossman, who said in his diaries that when he found officials reluctant to do his will:

“I bullied them and made a fool of them in front of others, quite often their subordinates”.


I suspect such an approach was counterproductive. Does the Minister agree that it is up to the electorate or elected superiors to get rid of Ministers who cannot deliver, not officials?

Baroness Neville-Rolfe (Con)

Ministers are of course part of the process of democratic election. I agree with much of what my noble friend said.

Lord Newby (LD)

My Lords, in his letter of resignation, the former Deputy Prime Minister said that the inquiry into his actions

“set a dangerous precedent for the conduct of good government”,

and set the “threshold for bullying” too low. The Prime Minister in response said that we should learn to manage these matters better in future. Does the Minister agree that the threat to good government comes not from the inquiry but from bullying Ministers, that the threshold which needs to be raised is that of ministerial behaviour, and that the lesson to be learned is that Ministers should behave themselves and not bully their staff?

Baroness Neville-Rolfe (Con)

Ministers are required to behave themselves and do behave themselves. The code includes the statement:

“Harassing, bullying or other inappropriate or discriminating behaviour wherever it takes place is not consistent with the Ministerial Code and will not be tolerated”.


Complaints are investigated, as we have been discussing.

Lord McDonald of Salford (CB)

In his report, Mr Tolley took care to anonymise all the complainants. Reading the report, it was not possible to see who had complained. In his resignation letter, the former Deputy Prime Minister mentioned a Gibraltar negotiation and then someone leaked the name of the British ambassador to Spain to the Telegraph. Will His Majesty’s Government condemn that leak?

Baroness Neville-Rolfe (Con)

I read the Tolley report. He took great care on this matter. Where there are specific allegations, it can be very difficult to guarantee anonymity in a process like this. It is important for fairness that the full details of the complaint are made known. Although the Deputy Prime Minister stepped down and there were findings of concern, there were also areas where Mr Tolley took a different view.

Baroness Chapman of Darlington (Lab)

The Minister is choosing her words carefully, and she has our sympathy for that, but the extraordinarily poor grace of Mr Raab’s resignation letter means that this case has failed to clarify the standards expected of Ministers.

“The conclusion of the Raab inquiry has done nothing to help other ministers who misunderstand what professional behaviour looks like avoid getting into the same position”.


Those are not my words but the words of the Institute for Government. Is it not time that the Government introduced an independent adviser with the power to initiate investigations? Should there not also be an independent review of the effectiveness of the Ministerial Code?

Baroness Neville-Rolfe (Con)

I should point out that in his letter, Dominic Raab, who did some good things as a Secretary of State, said:

“I am genuinely sorry for any unintended stress or offence that any officials felt”.


An independent adviser, Mr Tolley, was asked to conduct the inquiry because at that time there was no ethics adviser, as the noble Baroness knows. Sir Laurie Magnus has since been appointed. He can initiate, but he has to get the approval of the Prime Minister. As we discussed on Tuesday, the arrangements have been changed and the process shows that, where there is need for an inquiry, an inquiry takes place.

Lord Young of Cookham (Con)

My Lords, looking around, I see many noble Lords who have had more successful ministerial experiences than mine, but none who lasted 21 years. My experience is that you do not get the best out of civil servants by shouting at them. There is no organised conspiracy to frustrate the will of Ministers, but some Ministers may see as obstruction civil servants doing their job by pointing out the adverse consequences of certain policy options. If we have a review of the complaints procedure, can we debate it in this House so the plethora of ex-Ministers, ex-civil servants and others can contribute to that review?

Baroness Neville-Rolfe (Con)

I think almost no Secretary of State has been as successful as my noble friend, and he has helped here as well by joining the Front Bench. What we debate in this House is a matter for the usual channels, but we are getting on and work is under way on the complaints process.

Lord Bird (CB)

My experience, having spoken to a number of Ministers, is that a couple of them have said things like, “You won’t get this past the Civil Service”. What does that mean?

Baroness Neville-Rolfe (Con)

I do not dare to speculate on what the thing in question was. The Civil Service has a fundamental principle of political impartiality so, in considering proposals, that is something they have to look at. If something is improper, then the good civil servant—I used to be one—will point that out to the Minister of the day, and it might be that that is what was meant. Obviously Ministers are advised by civil servants on matters of policy, and it is clear that civil servants sometimes disagree with Ministers.

Baroness Donaghy (Lab)

I once asked a senior civil servant who were their favourite Ministers to work with. In confidence, they said Nicholas Ridley and the noble Lord, Lord Mandelson—which in itself is an interesting combination. I asked why, and they said it was because you knew where you stood with them and they were decisive. I think that is the definition of a good Minister. I have never met a civil servant who was disloyal, but I have met people who say that they would rather not receive direct instructions via a spad and would rather speak to a Minister. I think that is not necessarily because of the quality of the spad, but because of the method of avoiding talking to civil servants. Does the Minister agree?

Baroness Neville-Rolfe (Con)

The noble Baroness makes a very good point. These are the sort of points that come up when we debate these things. Good Ministers decide clearly, and civil servants and political spads provide advice, which can be helpful. Spads can indeed be helpful to civil servants, as I remember.

House of Lords Commission, Services, Liaison, Procedure and Privileges and Selection Committees

Thursday 27th April 2023

Lords Chamber
Membership Motion
11:50
Moved by
The Senior Deputy Speaker (Lord Gardiner of Kimble)

That the Earl of Kinnoull be appointed to the following Select Committees, in place of Lord Judge:

House of Lords Commission

Services

Liaison

Procedure and Privileges

Selection.

The Senior Deputy Speaker (Lord Gardiner of Kimble)

My Lords, in moving the five Motions standing in my name on the Order Paper, I express my considerable thanks to the noble and learned Lord, Lord Judge, on behalf of the whole House for his work as Convenor of the Cross Benches over the past three and a half years, and recognise his service and significant contribution to those committees on which he served as convenor. We look forward to the noble and learned Lord’s return to the House in due course. I beg to move.

Motion agreed.

Artificial Intelligence in Weapon Systems Committee

Thursday 27th April 2023

Lords Chamber
Membership Motion
11:50
Moved by
The Senior Deputy Speaker

That Lord Mitchell and Lord Triesman be appointed members of the Select Committee in place of Baroness Anderson of Stoke-on-Trent and Baroness Symons of Vernham Dean.

Motion agreed.

Built Environment Committee

Thursday 27th April 2023

Lords Chamber
Membership Motion
11:50
Moved by
The Senior Deputy Speaker

That Baroness Warwick of Undercliffe be appointed a member of the Select Committee.

Motion agreed.

Constitution Committee

Thursday 27th April 2023

Lords Chamber
Membership Motion
11:50
Moved by
The Senior Deputy Speaker

That Baroness Finn be appointed a member of the Select Committee in place of Lord Howard of Lympne.

Motion agreed.

Science and Technology Committee

Thursday 27th April 2023

Lords Chamber
Membership Motion
11:50
Moved by
Lord Gardiner of Kimble

That Viscount Stansgate be appointed a member of the Select Committee in place of Baroness Warwick of Undercliffe.

Motion agreed.

Online Safety Bill

Committee (3rd Day)
11:51
Relevant document: 28th Report from the Delegated Powers Committee
Clause 6: Providers of user-to-user services: duties of care
Amendment 13
Moved by
13: Clause 6, page 5, line 33, after “services” insert “that are not Category 2A services”
Member’s explanatory statement
This amendment is consequential on other amendments in the name of Lord Moylan to remove Clause 23(3) and the subsequent new Clause after 23, the effect of which is that the duties imposed on search services vary depending on whether or not they are Category 2A services: this needs to be reflected in the provision about combined services (regulated user-to-user services that include public search services) in Clause 6.
Lord Moylan (Con)

My Lords, in moving my Amendment 13 I will speak to all the amendments in the group, all of which are in my name with the exception of Amendment 157 in the name of my noble friend Lord Pickles. These are interlinked amendments; they work together. There is effectively only one amendment going on. A noble Lord challenged me a day or two ago as to whether I could summarise in a sentence what the amendment does, and the answer is that I think I can: Clause 23 imposes various duties on search engines, and this amendment would remove one of those duties from search engines that fall into category 2B.

There are two categories of search engines, 2A and 2B, and category 2B is the smaller search engines. We do not know the difference between them in greater detail than that because the schedule that relates to them reserves to the Secretary of State the power to set the thresholds that will define which category a search engine falls into, but I think it is clear that category 2B is the smaller ones.

These amendments pursue a theme that I brought up in Committee earlier in the week when I argued that the Bill would put excessively onerous and unnecessary obligations on smaller businesses. The particular duty that these amendments would take away from smaller search engines is referred to in Clause 23(2):

“A duty, in relation to a service, to take or use proportionate measures relating to the design or operation of the service to effectively mitigate and manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service”.


The purpose of that is to recognise that very large numbers of smaller businesses do not pose a risk, according to the Government’s own assessment of the market, and to allow them to get on with their business without taking these onerous and difficult measures. They are probing amendments to try to find out what the Government are willing to do in relation to smaller businesses that will make this a workable Bill.

I can already imagine that there are noble Lords in the Chamber who will say that small does not equal safe, and that small businesses need to be covered by the same rigorous regulations as larger businesses. But I am not saying that small equals safe. I am saying—as I attempted to say when the Committee met earlier—that absolute safety is not attainable. It is not attainable in the real world, nor can we expect it to be attainable in the online world. I imagine that objection will be made. I see it has some force, but I do not think it has sufficient compelling force to put the sort of burden on small businesses that this Bill would do, and I would like to hear more about it.

I will say one other thing. Those who object to this approach need to be sure in their own minds that they are not contributing to creating a piece of legislation that, when it comes into operation, is so difficult to implement that it becomes discredited. There needs to be a recognition that this has to work in practice. If it does not—if it creates resentment and opposition—we will find the Government not bringing sections of it into force, needing to repeal them or going easy on them once the blowback starts, so to speak. With that, I beg to move.

Baroness Deech (CB)

My Lords, I will speak to Amendment 157 in the name of the noble Lord, Lord Pickles, and others, since the noble Lord is unavoidably absent. It is along the same lines as Amendment 13; it is relatively minor and straightforward, and asks the Government to recognise that search services such as Google are enormously important as an entry to the internet. They are different from social media companies such as Twitter. We ask that the Government be consistent in applying their stated terms when these are breached in respect of harm to users, whether that be through algorithms, through auto-prompts or otherwise.

As noble Lords will be aware, the Bill treats user-to-user services, such as Meta, and search services, such as Google, differently. The so-called third shield or toggle proposed for shielding users from legal but harmful content, should they wish to be shielded, does not apply when it comes to search services, important though they are. Indeed, at present, large, traditional search services, including Google and Microsoft Bing, and voice search assistants, including Alexa and Siri, will be exempted from several of the requirements for large user-to-user services—category 1 companies. Why the discrepancy? Though search services rightly highlight that the content returned by a search is not created or published by them, the algorithmic indexing, promotion and search prompts provided in search bars—the systems they design and employ—are their responsibility, and these have been proven to do harm.

Some of the examples of such harm have already been cited in the other place, but not before this Committee. I do not want to give them too much of an airing because they were in the past, and the search people have taken them down after complaints, but some of the dreadful things that emerge from searching on Google et cetera are a warning of what could occur. It has been pointed out that search engines would in the past have thrown up, for example, swastikas, SS bolts and other Nazi memorabilia when people searched for desk ornaments. If George Soros’s name came up, he would be included in a list of people responsible for world evils. The Bing service, which I dislike anyway, has been directing people—at least, it did in the past—to anti-Semitic and homophobic searches through its auto-complete, while Google’s image carousel highlighted pictures of portable barbecues to those searching for the term “Jewish baby stroller”.

12:00
These search engines, which are larger than some countries in terms of the funds they raise, should be treated in the same way as Meta, Twitter and others, knowing the harm that their systems can cause. The Joint Committee on the draft Bill, and Ministers in meetings with the APPG Against Antisemitism, have been clear that this is an issue and recognised that it needs addressing. I hope the Minister will agree that our amendment, or perhaps one similar to it that the Government might care to introduce in the next stages, would be a small, smart and meaningful technical fix to the Bill in addressing the unnecessary imbalance that allows major search companies to avoid protecting the public to the full extent that we, in the Bill, expect of other large companies. I hope that the Minister will agree to meet interested parties and to do the sensible and right thing about search engines.

Lord Weir of Ballyholme (DUP)

My Lords, I also support Amendment 157, which stands in the name of the noble Lord, Lord Pickles, and others, including my own. As the noble Baroness, Lady Deech, indicated, it is specific in the nature of what it concentrates on. The greatest concern that arises through the amendment is with reference to category 2A. It is not necessarily incompatible with what the noble Lord, Lord Moylan, proposes; I do not intend to make any direct further comment on his amendments. While the amendment is specific, it has a resonance with some of the other issues raised on the Bill.

I am sure that everyone within this Committee would want to have a Bill that is as fit for purpose as possible. The Bill was given widespread support at Second Reading, so there is a determination across the Chamber to have that. Where we can make improvements to the Bill, we should do that and, as much as possible, try to future-proof the Bill. The wider resonance is the concern that if the Bill is to be successful, we need as much consistency and clarity within it as possible, particularly for users. Where we have a false dichotomy in the regulations, that runs contrary to the intended purposes of the Bill and creates inadvertent opportunities for loopholes. As such, and as has been indicated, the concern is that in the Bill at present, major search engines are effectively treated in some of the regulations on a different basis from user-to-user services. For example, some of the provisions around risk assessment, the third shield and the empowerment tools are different.

As also indicated, we are not talking about some of the minor search engines. We are talking about some of the largest companies in the world, be it Google, Microsoft through Bing, Amazon through its devices or Apple through its Siri voice tool, so it is reasonable that they are brought into line with what is there for user-to-user services. The amendment is therefore appropriate and the rationale for it is that there is a real-world danger. Mention has been made—we do not want to dwell too long on some of the examples, but I will use just one—of the realms of anti-Semitism, where I have a particular interest. For example, a while ago one search engine’s auto-complete offered a prompt suggesting that Jews are evil. It was found that when that prompt was there, searches of that nature increased by 10% and when it was removed, they were reduced. It is quite fixable, and the problem extends into a wide range of areas.

One of the ways in which technology has changed, I think for us all, is the danger that it can be abused by people who seek to radicalise others and make them extreme, particularly young children. Gone are the days when some of these extremists or terrorists were lonely individuals in an attic, with no real contact with the outside world, or hanging around occasionally in the high street while handing out poorly produced A4 papers with their hateful ideology. There is a global interconnection here and, in particular, search engines and user-to-user services can be used to try to draw young people into their nefarious activities.

I mentioned the example of extremism and radicalisation when it comes to anti-Semitism. I have seen it from my own part of the world, where there is at times an attempt by those who still see violence as the way forward in Northern Ireland to draw new generations of young people into extremist ideology and terrorist acts. There is an attempt to lure in young people and, sadly, search engines have a role within that, which is why we need to see that level of protection. Now, the argument from search engines is that they should have some level of exemptions. How can they be held responsible for everything that appears through their searches, or indeed through the web? But in terms of content, the same argument could be used for user-to-user services. It is right, as the proposer of this amendment has indicated, that there are things such as algorithmic indexing and search prompts where they do have a level of control.

The use of algorithms has moved on considerably since my schooldays, as it surely has for everyone in this Committee, and I suspect that none of us felt that they would be used in such a fashion. We need a level of protection through an amendment such as this and, as its proposers, we are not doctrinaire on the precise form in which this should take place. We look, for example, at the provisions within Clause 11—we seek to hear what the Government have to say on that—which could potentially be used to regulate search engines. Ensuring that that power is given, and will be used by Ofcom, will go a long way to addressing many of the concerns.

I think all of us in this Committee are keen to work together to find the right solutions, but we feel that there is a need to make some level of change to the regulations that are required for search engines. None of us in this Committee believes that we will ultimately have a piece of legislation that reflects perfection, but there is a solemn duty on us all to produce legislation that is as fit for purpose and future-proofed as possible, while providing children in particular with the maximum protection in what is at times an ever-changing and sometimes very frightening world.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I agree in part with the noble Lord, Lord Moylan. I was the person who said that small was not safe, and I still feel that. I certainly do not think that anything in the Bill will make the world online 100% safe, and I think that very few noble Lords do, so it is important to say that. When we talk about creating a high bar or having zero tolerance, we are talking about ensuring that there is a ladder within the Bill so that the most extreme cases have the greatest force of law trying to attack them. I agree with the noble Lord on that.

I also absolutely agree with the noble Lord about implementation: if it is too complex and difficult, it will be unused and exploited in certain ways, and it will have a bad reputation. The only part of his amendment that I do not agree with is that we should look at size. Through the process of Committee, if we can look at risk rather than size, we will get somewhere. I share his impatience—or his inquiry—about what categories 2A and 2B mean. If category 2A means the most risky and category 2B means those that are less risky, I am with him all the way. We need to look into the definition of what they mean.

Finally, I mentioned several times on Tuesday that we need to look carefully at Ofcom’s risk profiles. Is this the answer to dealing with where risk gets determined, rather than size?

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I rise to speak along similar lines to the noble Baroness, Lady Kidron. I will address my noble friend Lord Moylan’s comments. I share his concern that we must not make the perfect the enemy of the good but, like the noble Baroness, I do not think that size is the key issue here, because of how tech businesses grow. Tech businesses are rather like building a skyscraper: if you get the foundations wrong, it is almost impossible to change how safe the building is as it goes up and up. As I said earlier this week, small tech businesses can become big very quickly, and, if you design your small tech business with the risks to children in mind at the very beginning, there is a much greater chance that your skyscraper will not wobble as it gets taller. On the other hand, if your small business begins by not taking children into account at all, it is almost impossible to address the problem once it is huge. I fear that this is the problem we face with today’s social media companies.

The noble Baroness, Lady Kidron, hit the nail on the head, as she so often does, in saying that we need to think about risk, rather than size, as the means of differentiating the proportionate response. In Clause 23, which my noble friend seeks to amend, the important phrase is “use proportionate measures” in subsection (2). Provided that we start with a risk assessment and companies are then under the obligation to make proportionate adjustments, that is how you build safe technology companies—it is just like how you build safe buildings.

Lord Bethell Portrait Lord Bethell (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I will build on my noble friend’s comments. We have what I call the Andrew Tate problem. That famous pornographer and disreputable character started a business in a shed in Romania with a dozen employees. By most people’s assessment, it would have been considered a small business but, through content featuring pornography and the physical assault of women, he extremely quickly built something that served an estimated 3 billion pages, and it has had a huge impact on the children of the English-speaking world. A small business became a big, nasty business very quickly. That anecdote reinforces the point that small does not mean safe, and, although I agree with many of my noble friend’s points, the lens of size is perhaps not the right one to look through.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I did not want to interrupt the noble Lord, Lord Moylan, in full flow as he introduced the amendments, but I believe he made an error in terms of the categorisation. The error is entirely rational, because he took the logical position rather than the one in the Bill. It is a helpful error because it allows us to quiz the Minister on the rationale for the categorisation scheme.

As I read it, in Clause 86, the categories are: category 1, which is large user-to-user services; category 2A, which is search or combined services; and category 2B, which is small user-to-user services. To my boring and logical binary brain, I would expect it to be: “1A: large user-to-user”; “1B: small user-to-user”; “2A: large search”; and “2B: small search”. I am curious about why a scheme like that was not adopted and we have ended up with something quite complicated. It is not only that: we now have this Part 3/Part 5 thing. I feel that we will be confused for years to come: we will be deciding whether something is a Part 3 2B service or a Part 5 service, and we will end up with a soup of numbers and letters that do not conform to any normal, rational approach to the world.

12:15
I am sure that a rationale does underlie that—the people who wrote the legislation will of course have come up with the schema for a reason—but it is important to push on that, because we want our legislation to be intelligible to people out there. Again, it would be entirely logical to have a schema that says, “1A: large user-to-user; 1B: small user-to-user; 2A: large search; 2B: small search; and 3: pornographic”. If the noble Baroness, Lady Kidron, has her way and we add extra services, we could make them categories 4 and 5, and we could have categories 4A and 4B.
Again, I hope that the Minister can take this opportunity to respond on the substance of whether there should be different requirements and to explain why we have that categorisation, where category 2B is small user-to-user services, category 1 is big user-to-user services and category 2A is search and combined services. That would probably not be the first assumption of most people in the House, and it has been bugging me since I first read the Bill, so it would be nice to get an answer today.
Baroness Merron Portrait Baroness Merron (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I welcome this debate, which revisits some of the areas discussed in earlier debates about the scope of the Bill, as many noble Lords said. It allows your Lordships’ House to consider what has to be the primary driver for assessment. In my view and as others said, it ought to be about risk, which has to be the absolute driver in all this. As the noble Baroness, Lady Harding, said, businesses do not remain static: they start at a certain size and then change. Of course, we hope that many of the businesses we are talking about will grow, so this is about preparation for growth and the reality of doing business.

As we discussed, there certainly are cases where search providers may, by their very nature, be almost immune from presenting users with content that could be considered either harmful or illegal under this legislative framework. The new clause proposed by the noble Lord, Lord Moylan—I am grateful to him for allowing us to explore these matters—and its various consequential amendments, would limit the duty to prevent access to illegal content to core category 2A search providers, rather than all search providers, as is currently the case under Clause 23(3).

The argument that I believe the noble Lord, Lord Moylan, put forward is that the illegal content duty is unduly wide, placing a disproportionate and otherwise unacceptable burden on smaller and/or supposedly safer search providers. He clearly said he was not saying that small was safe—that is now completely understood—but he also said that absolute safety is not achievable. As the noble Baroness, Lady Kidron, said, that is indeed so. If this legislation is too complex and creates the wrong provisions, we will clearly be a long way away from our ambition, which here has to be to have in place the best legislative framework, one that everyone can work with and that provides the maximum opportunity for safety and what we all seek to achieve.

Of course, the flip side of the argument about an unacceptable burden on smaller, or on supposedly safer, search providers may be that they would in fact have very little work to do to comply with the illegal content duty, at least in the short term. But the duty would act as an important safeguard, should the provider’s usual systems prove ineffective with the passage of time. Again, that point was emphasised in this and the previous debate by the noble Baroness, Lady Harding.

We look forward to the Minister’s response to find out which view he and his department subscribe to or, indeed, whether they have another view they can bring to your Lordships’ House. But, on the face of it, the current arrangements do not appear unacceptably onerous.

Amendment 157 in the name of the noble Lord, Lord Pickles, and introduced by the noble Baroness, Lady Deech, deals with search by a different approach, inserting requirements about search services’ publicly available statements into Clause 65. In the debate, the noble Baroness and the noble Lord, Lord Weir, raised very important, realistic examples of where search engines can take us, including to material that encourages racism and hatred directed at Jews and other groups. The amendment talks about issues such as the changing of algorithms or the hiding of content and the need to ensure that the terms of providers’ publicly available statements are applied consistently.

I look forward to hearing from the Minister in response to Amendment 157, as that certainly moves us beyond questions of scope and towards discussion of the conduct of platforms when harm is identified.

Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I must first apologise for my slightly dishevelled appearance as I managed to spill coffee down my shirt on my way to the Chamber. I apologise for that—as the fumes from the dried coffee suffuse the air around me. It will certainly keep me caffeinated for the day ahead.

Search services play a critical role in users’ online experience, allowing them easily to find and access a broad range of information online. Their gateway function, as we have discussed previously, means that they also play an important role in keeping users safe online because they have significant influence over the content people encounter. The Bill therefore imposes stringent requirements on search services to tackle the risks from illegal content and to protect children.

Amendments 13, 15, 66 to 69 and 73 tabled by my noble friend Lord Moylan seek to narrow the scope of the Bill so that its search safety duties apply only to the largest search services—categorised in the Bill as category 2A services—rather than to all search services. Narrowing the scope in this way would have an adverse impact on the safety of people using search services, including children. Search services, including combined services, below the category 2A threshold would no longer have a duty to minimise the risk of users encountering illegal content or children encountering harmful content in or via search results. This would increase the likelihood of users, including children, accessing illegal content and children accessing harmful content through these services.

The Bill already takes a targeted approach and the duties on search services will be proportionate to the risk of harm and the capacity of companies. This means that services which are smaller and lower-risk will have a lighter regulatory burden than those which are larger and higher-risk. All search services will be required to conduct regular illegal content risk assessments and, where relevant, children’s risk assessments, and then implement proportionate mitigations to protect users, including children. Ofcom will set out in its codes of practice specific steps search services can take to ensure compliance and must ensure that these are proportionate to the size and capacity of the service.

The noble Baroness, Lady Kidron, and my noble friend Lady Harding of Winscombe asked how search services should conduct their risk assessments. Regulated search services will have a duty to conduct regular illegal content risk assessments, and where a service is likely to be accessed by children it will have a duty to conduct regular children’s risk assessments, as I say. They will be required to assess the level and nature of the risk of individuals encountering illegal content on their service, to implement proportionate mitigations to protect people from illegal content, and to monitor them for effectiveness. Services likely to be accessed by children will also be required to assess the nature and level of risk of their service specifically for children to identify and implement proportionate mitigations to keep children safe, and to monitor them for effectiveness as well.

Companies will also need to assess how the design and operation of the service may increase or reduce the risks identified and Ofcom will have a duty to issue guidance to assist providers in carrying out their risk assessments. That will ensure that providers have, for instance, sufficient clarity about what an appropriate risk assessment looks like for their type of service.

The noble Lord, Lord Allan, and others asked about definitions and I congratulate noble Lords on avoiding the obvious

“To be, or not to be”


pun in the debate we have just had. The noble Lord, Lord Allan, is right in the definition he set out. On the rationale for it, it is simply that we have designated as category 1 the largest and riskiest services and as category 2 the smaller and less risky ones, splitting them between 2A, search services, and 2B, user-to-user services. We think that is a clear framework. The definitions are set out a bit more in the Explanatory Notes but that is the rationale.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I am grateful to the Minister for that clarification. I take it then that the Government’s working assumption is that all search services, including the biggest ones, are by definition less risky than the larger user-to-user services. It is just a clarification that that is their thinking that has informed this.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

As I said, the largest and riskiest sites may involve some which have search functions, so the test of large and most risky applies. Smaller and less risky search services are captured in category 2A.

Amendment 157 in the name of my noble friend Lord Pickles, and spoken to by the noble Baroness, Lady Deech, seeks to apply new duties on the largest search services. I agree with the objectives in my noble friend’s amendment of increasing transparency about the search services’ operations and enabling users to hold them to account. It is not, however, an amendment I can accept because it would duplicate existing duties while imposing new duties which we do not think are appropriate for search services.

As I say, the Bill will already require search services to set out how they are fulfilling their illegal content and child safety duties in publicly available statements. The largest search services—category 2A—will also be obliged to publish a summary of their risk assessments and to share this with Ofcom. That will ensure that users know what to expect on those search services. In addition, they will be subject to the Bill’s requirements relating to user reporting and redress. These will ensure that search services put in place effective and accessible mechanisms for users to report illegal content and content which is harmful to children.

My noble friend’s amendment would ensure that the requirements to comply with its publicly available statements applied to all actions taken by a search service to prevent harm, not just those relating to illegal content and child safety. This would be a significant expansion of the duties, resulting in Ofcom overseeing how search services treat legal content which is accessed by adults. That runs counter to the Government’s stated desire to avoid labelling legal content which is accessed by adults as harmful. It is for adult users themselves to determine what legal content they consider harmful. It is not for us to put in place measures which could limit their access to legal content, however distasteful. That is not to say, of course, that where material becomes illegal in its nature we do not share the determination of the noble Baroness, my noble friend and others to make sure that it is properly tackled. The Secretary of State and Ministers have had extensive meetings with groups making representations on this point and I am very happy to continue speaking to my noble friend, the noble Baroness and others if they would welcome it.

I hope that that provides enough reassurance for the amendment to be withdrawn at this stage.

12:30
Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful to all noble Lords who have spoken in this debate. I hope that the noble Baroness, Lady Deech, and the noble Lord, Lord Weir of Ballyholme, will forgive me if I do not comment on the amendment they spoke to in the name of my noble friend Lord Pickles, except to say that of course they made their case very well.

I will briefly comment on the remarks of the noble Baroness, Lady Kidron. I am glad to see a degree of common ground among us in terms of definitions and so forth—a small piece of common ground that we could perhaps expand in the course of the many days we are going to be locked up together in your Lordships’ House.

I am grateful too to the noble Lord, Lord Allan of Hallam. I am less clear on “2B or not 2B”, if that is the correct way of referring to this conundrum, than I was before. The noble Baroness, Lady Kidron, said that size does not matter and that it is all about risk, but my noble friend the Minister cunningly conflated the two and said at various points “the largest” and “the riskiest”. I do not see why the largest are necessarily the riskiest. On the whole, if I go to Marks & Spencer as opposed to going to a corner shop, I might expect rather less risk. I do not see why the two run together.

I address the question of size in my amendment because that is what the Bill focuses on. I gather that the noble Baroness, Lady Kidron, may want to explore at some stage in Committee why that is the case and whether a risk threshold might be better than a size threshold. If she does that, I will be very interested in following and maybe even contributing to that debate. However, at the moment, I do not think that any of us is terribly satisfied with conflating the two—that is the least satisfactory way of explaining and justifying the structure of the Bill.

On the remarks of my noble friend Lady Harding of Winscombe, I do not want in the slightest to sound as if there is any significant disagreement between us—but there is. She suggested that I was opening the way to businesses building business models “not taking children into account at all”. My amendment is much more modest than that. There are two ways of dealing with harm in any aspect of life. One is to wait for it to arrive and then to address it as it arises; the other is constantly to look out for it in advance and to try to prevent it arising. The amendment would leave fully in place the obligation to remove harm, which is priority illegal content or other illegal content, that the provider knows about, having been alerted to it by another person or become aware of it in any other way. That duty would remain. The duty that is removed, especially from small businesses—and really this is quite important—is the obligation constantly to be looking out for harm, because it involves a very large, and I suggest possibly ruinous, commitment to constant monitoring of what appears on a search engine. That is potentially prohibitive, and it arises in other contexts in the Bill as well.

There should be occasions when we can say that knowing that harmful stuff will be removed as soon as it appears, or very quickly afterwards, is adequate for our purposes, without requiring firms to go through a constant monitoring or risk-assessment process. The risk assessment would have to be adjudicated by Ofcom, I gather. Even if no risk was found, of course, that would not be the end of the matter, because I am sure that Ofcom would, very sensibly, require a renewal of that application annually, or after a certain period, to make sure that things had not changed. So even to escape the burden is quite a large burden for small businesses, and then to implement the burden is so onerous that it could be ruinous, whereas taking stuff down when it appears is much easier to do.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

Perhaps I might briefly come in. My noble friend Lord Moylan may have helped explain why we disagree: our definition of harm is very different. I am most concerned that we address the cumulative harms that online services, both user-to-user services and search, are capable of inflicting. That requires us to focus on the design of the service, which we need to do at the beginning, rather than the simpler harm that my noble friend is addressing, which is specific harmful content—not in the sense in which “content” is used in the Bill but “content” as in common parlance; that is, a piece of visual or audio content. My noble friend makes the valid point that that is the simplest way to focus on removing specific pieces of video or text; I am more concerned that we should not exclude small businesses from designing and developing their services such that they do not consider the broader set of harms that are possible and that add up to the cumulative harm that we see our children suffering from today.

So I think our reason for disagreement is that we are focusing on a different harm, rather than that we violently disagree. I agree with my noble friend that I do not want complex bureaucratic processes imposed on small businesses; they need to design their services when they are small, which makes it simpler and easier for them to monitor harm as they grow, rather than waiting until they have grown. That is because the backwards re-engineering of a technology stack is nigh-on impossible.

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

My noble friend makes a very interesting point, and there is much to ponder in it—too much to ponder for me to respond to it immediately. Since I am confident that the issue is going to arise again during our sitting in Committee, I shall allow myself the time to reflect on it and come back later.

While I understand my noble friend’s concern about children, the clause that I propose to remove is not specific to children; it relates to individuals, so it covers adults as well. I think I understand what my noble friend is trying to achieve—I shall reflect on it—but this Bill and the clauses we are discussing are a very blunt way of going at it and probably need more refinement even than the amendments we have seen tabled so far. But that is for her to consider.

I think this debate has been very valuable. I did not mention it, but I am grateful also for the contribution from the noble Baroness, Lady Merron. I beg leave to withdraw the amendment.

Amendment 13 withdrawn.
Baroness Deech Portrait Baroness Deech (CB)
- View Speech - Hansard - - - Excerpts

Having listened to the Minister, I think we need clarification on the issue of duplication and what is illegal as opposed to just harmful. If we can clarify that, I shall not move my Amendment 157.

Lord Beith Portrait The Deputy Chairman of Committees (Lord Beith) (LD)
- Hansard - - - Excerpts

When we come to Amendment 157, that will be noted.

Amendments 13A to 13C

Moved by
13A: Clause 6, page 5, line 35, after “service” insert “is not a Category 2A service and”
Member’s explanatory statement
This technical amendment ensures that the duties imposed on providers of combined services in relation to the search engine are correct following the changes to clause 20 arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only.
13B: Clause 6, page 5, line 37, after “service” insert “is not a Category 2A service and”
Member’s explanatory statement
This technical amendment ensures that the duties imposed on providers of combined services in relation to the search engine are correct following the changes to clause 20 arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only.
13C: Clause 6, page 5, line 38, at end insert—
“(c) if the service is a Category 2A service not likely to be accessed by children, the duties set out in Chapter 3 referred to in section 20(2) and (3A);
(d) if the service is a Category 2A service likely to be accessed by children, the duties set out in Chapter 3 referred to in section 20(2), (3) and (3A).”
Member’s explanatory statement
This amendment ensures that the new duties set out in the amendments in the Minister’s name to clauses 23, 25 and 29 below (duties to summarise risk assessments in a publicly available statement and to supply records of risk assessments to OFCOM) are imposed on providers of combined services that are Category 2A services in relation to the search engine.
Amendments 13A to 13C agreed.
Amendment 14
Moved by
14: Clause 6, page 5, line 38, at end insert—
“(6A) Providers of regulated user-to-user services are required to comply with duties under subsections (2) to (6) for each such service which they provide to the extent that is proportionate and technically feasible without making fundamental changes to the nature of the service (for example, by removing or weakening end-to-end encryption on an end-to-end encrypted service).”
Member’s explanatory statement
This amendment is part of a series of amendments by Lord Clement-Jones intended to ensure risk assessments are not used as a tool to undermine users’ privacy and security.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

My Lords, I propose Amendment 14 on behalf of my noble friend Lord Clement-Jones and the noble Lord, Lord Hunt of Kings Heath, who are not able to be present today due to prior commitments. I notice that the amendment has been signed also by the noble Baroness, Lady Fox, who I am sure will speak to it herself. I shall speak to the group of amendments as a whole.

I shall need to speak at some length to this group, as it covers some quite complex issues, even for this Bill, but I hope that the Committee will agree that this is appropriate given the amendments’ importance. I also expect that this is one area where noble Lords are receiving the most lobbying from different directions, so we should do it justice in our Committee.

We should start with a short summary of the concern that lies behind the amendments: that the Bill, as drafted, particularly under Clause 110, grants Ofcom the power to issue technical notices to online services that could, either explicitly or implicitly, require them to remove privacy protections—and, in particular, that this could undermine a technology that is increasingly being deployed on private messaging services called end-to-end encryption. The amendments in this group use various mechanisms to reduce the likelihood of that being an outcome. Amendments 14 and 108 seek to make it clear in the Bill that end-to-end encryption would be out of scope—and, as I understand it, Amendment 205, tabled by the noble Lord, Lord Moylan, seeks to do something similar.

A second set of amendments would add in extra controls over the issuing of technical notices. While not explicitly saying that these could not target E2EE—if noble Lords will excuse the double negative—they would make it less likely by ensuring that there is more scrutiny. They include a whole series of amendments—Amendments 202 and 206, tabled by the noble Lord, Lord Stevenson, and Amendment 207—that have the effect of ensuring that there is more scrutiny and input into issuing such a notice.

The third set of amendments aims to ensure that Ofcom gives weight to privacy more generally in all the actions it takes. In particular, Amendment 190 talks about a broader privacy duty, and Amendment 285—which I think the noble Lord, Lord Moylan, will be excited about—seeks to restrict general monitoring.

I will now dig into why this is important. Put simply, there is a risk that under the Bill a range of internet services will feel that they are unable to offer their products in the UK. This speaks to a larger question as we debate the measures in the Bill, as it can sometimes feel as though we are comfortable ratcheting up the requirements in the Bill under the assumption that services will have no choice but to meet them and carry on. While online services will not have a choice about complying if they wish to be lawfully present in the UK, they will be free to exit the market altogether if they believe that the requirements are excessively onerous or impossible to meet.

In the Bill, we are constructing, in effect, a de facto licensing mechanism, where Ofcom will contact in-scope services—the category 2A, category 2B, Part 3 and Part 5 services we discussed in relation to the previous group of amendments—will order them to follow all the relevant regulation and guidance and will instruct them to pay a fee for that supervision. We have to consider that some services, on receipt of that notice, will take steps to restrict access by people in the UK rather than agree to such a licence. Where those are rogue services, this reaction is consistent with the aims of the Bill. We do not want services which are careless about online safety to be present in the UK market. But I do not believe that it is our aim to force mainstream services out of the UK market and, if there is a chance of that happening, it should give us pause for thought.

As a general rule, I am not given to apocalyptic warnings, but I believe there is a real risk that some of the concerns that noble Lords will be receiving in their inboxes are genuine, so I want to unpick why that may be the case. We should reflect for a moment on the assumptions we may have about the people involved in this debate and their motivations. We often see tech people characterised as oblivious to harms, and security services people as uncaring about human rights. In my experience, both caricatures are off the mark, as tech people hate to see their services abused and security service representatives understand that they need to be careful about how they exercise the great powers we have given them. We should note that, much of the time, those two communities work well together in spaces such as the Global Internet Forum to Counter Terrorism.

If this characterisation is accurate, why do I think we may have a breakdown over the specific technology of end-to-end encryption? To understand this subject, we need to spend a few moments looking at trends in technology and regulation over recent years. First, we can look at the growth of content-scanning tools, which I think may have been in the Government’s mind when they framed and drafted the new Clause 110 notices. As social media services developed, they had to consider the risks of hosting content on the services that users had uploaded. That content could be illegal in all sorts of ways, including serious forms, such as child sexual abuse material and terrorist threats, as well as things such as copyright infringement, defamatory remarks and so on. Platforms have strong incentives to keep that material off their servers for both moral and legal reasons, so they began to develop and deploy a range of tools to identify and remove it. As a minimum, most large platforms now deploy systems to capture child sexual abuse material and copyright-infringing material, using technologies such as PhotoDNA and Audible Magic.

12:45
I stress again that, in the context of our debate on these amendments, a key element in the rationale for deploying these tools voluntarily—not because they are required to do so by law—is the fact that social media services are acting as hosts for content on their servers, so they feel partially liable for it; in fact, in legal terms, they may well be strictly liable for it. By contrast, modern private messaging services tend to have quite a different architecture, where the provider does not host content on its servers but simply moves it from one device on the network to another. There are some exceptions to that with legacy services, such as Facebook Messenger and the caching of large files—we could go into that subject, if noble Lords are interested. But the key point is that there has been a trend towards more functionality at the edge—namely, on the device in your pocket—as we move from classic social media, which depended on servers, to messaging. That distinction is critical when we consider what is commonly referred to as client-side scanning. The scanning that takes place today generally takes place on platform servers on content they are hosting themselves. The introduction of scanning on to people’s own devices is a different beast in technical, legal and ethical terms; I am sure we will want to tease that out in the debate.
The second trend we have seen is the concern over government surveillance. Back in the day, we may have been comfortable with the security services having a desk in the telephone exchange or asking their mate Bob, who does the filing at some company, to pass them information about a dodgy character—but the landscape has shifted. The Snowden revelations triggered a huge debate about the reach of Governments into our online lives—even those whom we think are on our side, such as the UK Government or the US Government—and we are increasingly concerned about foreign surveillance at home, to the extent that we are willing to spend a fortune pulling Huawei devices out of core UK telecom networks to mitigate the risk of Chinese government access. If you think that a foreign Government have gained access to the UK’s telecom networks, using an end-to-end encrypted service is one of the best ways to protect yourself, which, I am sure, is on the minds of the technical staff of UK political parties when they choose to put their teams on encrypted apps such as WhatsApp.
Thirdly, there is a general trend in privacy expectations and legislation, which are all heading in one direction: improving transparency over what is being done with data and giving people more power to withhold or grant consent. This reflects the fact that more of our lives are moving online, so being able to control it becomes more critical to us all. We see this trend playing out in multiple pieces of legislation, such as the general data protection regulation and the privacy regulation, as well as in actions taken by regulators to step up enforcement.
Far from being an irrational move by platforms careless as to its negative impacts, the adoption of end-to-end encryption is an entirely rational response to these three powerful regulatory and societal trends. It can help to mitigate the ever-increasing risks related to content liability—which the Bill, in fact, adds to—it makes hostile government surveillance much harder, and it is a key safeguard against privacy violations.
If this is where we have been with regulation incentivising the adoption of end-to-end encryption, how might this play out as we introduce a new element in the mix with the Online Safety Bill? I can see three scenarios that could play out as the Bill comes into force and Ofcom gains powers to issue directions to platforms. First, the Government could declare that their intent is to impose technical requirements that would mean that people in the UK will no longer be able to use truly secure end-to-end encrypted products. That could be either through explicit instructions to messaging service providers to remove the end-to-end encryption, or through requiring client-side scanning to be installed on user devices in the UK, which would, in effect, render them less secure. That is not my preferred option, but it would at least allow for an orderly transition, if services choose to withdraw products from the UK market rather than operate here on these terms. It might be that there are no significant withdrawals, and the UK Government could congratulate themselves on calling the companies’ bluff and getting what they want at little cost, but I doubt that this would be the case, given the strength of feeling out there—which, I am sure, we have all seen. We would at least want to know, one way or the other, which way that will go before adopting that course of action.
The second course is for the Government to continue with the posture of intentional ambiguity, as they have done to date. They are careful to say that they have no intention of banning end-to-end encryption, and I expect to hear that again in the Minister’s response today, but at the same time refuse to confirm that they could not do so under the new powers in the Bill. This creates a high-stakes game of chicken, where the Government think companies will give them more if they hold the threat of drastic technical orders over them. That “more” might include providing more metadata—who messaged whom, when and from where—or tools to identify patterns of suspicious behaviour without reading message content. These are all things we can expect the Government to be discussing with companies, as well as urging them to deploy forms of client-side scanning voluntarily.
As a veteran of a thousand psychic wars of this kind, I have to say that I do not think it is as productive a way to proceed as some in government may believe. It is all too common to have meetings with government representatives where you are working together on responding to terrorist content only to find a Minister going out the next day to say that your platform does not care about terrorism. I get it; this is politics. However, it is hard to explain to engineers who you are asking to go the extra mile to build new safety tools why they should do so when the Government who asked for the tools give them no credit for this. I understand the appeal from the government side of going into a negotiation with a big regulatory stick that you can show to the other side, but I think it is misguided.
The Government’s hope is that companies will blink first in the game of chicken and give them what they want, but it is at least as likely that the Government will blink first and have to abandon proposals, which risks discrediting their efforts as a whole. If nobody blinks, and we allow an unstoppable force to hit an immovable object, we could end up with the complete breakdown of key relationships and years of unproductive litigation. I believe that the interests of people in the UK lie in government being able to work with the services that millions of us use to find the best ways to combat harms—harms that everybody, on both sides, agree are a priority.
That brings me to my third and final scenario, and the one that these amendments are seeking to create. This is where the Government accept that end-to-end encrypted communication services are a legitimate part of the modern online environment that should not be undermined or pushed out of the UK market. The Government would explicitly rule out any intention to use orders under Clause 110 to weaken end-to-end encrypted services and instead focus their efforts on making it clear to people that end-to-end encryption does not mean impunity.
I was talking to my children as I came in about the fact that end-to-end encryption is not entirely secure and does not grant absolute privacy, and they said, “Of course—everyone should do the online safety classes we do at school”. These offer the simple message that it is foolish to send things over any internet service that you would not want to be shared widely, and the training tells you that any message can be screenshotted and passed around. Rather than talking up the fact that end-to-end encryption is protecting people sharing bad content, we should be talking up the ways in which you remain exposed.
Sadly, we have become used to reading stories about awful content being shared in groups on messaging services used by serving police officers—these were WhatsApp end-to-end encrypted messages. If there is legitimate interest in investigating content, we will see it serviced, whether or not it is shared on an encrypted service. Unless people are communicating only with themselves, there are multiple ways that their content, if illegal, might come to the attention of the authorities. The most obvious is that someone who is privy to the content hands it over, either voluntarily or because they are themselves under investigation. But the police and security services also have a range of intrusive surveillance tools at their disposal which can compromise the devices of their targets under properly warranted authority, and all the content on any apps they use can be provided to the security services properly, under the controls in the Regulation of Investigatory Powers Act. There are long-standing powers, sometimes used controversially, to require people to grant access to their own devices if there are grounds to think it is necessary to investigate some types of offence.
I hope the Government will give serious consideration to moving in this direction and to accepting the force of the amendments that have been put forward today. This is not about weakening the fight against appalling harms such as child sexual abuse material and terrorism, but rather about finding the most practical way to wage that fight in a world where end-to-end encryption exists and is being used to mitigate other material risks to our online lives. I beg to move.
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I support Amendment 190 in the name of the noble Lord, Lord Clement-Jones, and Amendment 285 in the name of the noble Lord, Lord Stevenson. That is not to say that I do not have a great deal of sympathy for the incredibly detailed and expert speech we have just heard, but I want to say just a couple of things.

First, I think we need to have a new conversation about privacy in general. The privacy that is imagined by one community is between the state and the individual, and the privacy that we do not have is between individuals and the commercial companies. We live in a 3D world and the argument remains 2D. We cannot do that today, but I agree with the noble Lord that many in the enforcement community do have one hand on human rights, and many in the tech world do care about human rights. However, I do not believe that the tech sector has fully fessed up to its role and the contribution it could make around privacy. I hope that, as part of the debate on the Bill, and the debate that we will have subsequently on the data Bill No. 2, we come to untangle some of the things that they defend—in my view, unnecessarily and unfairly.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I point out that one of the benefits of end-to-end encryption is that it precisely stops companies doing things such as targeted advertising based on the content of people’s communications. Again, I think there is a very strong and correct trend to push companies in that direction.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I thank the noble Lord for the intervention. For those noble Lords who are not following the numbers, Amendment 285, which I support, would prevent general monitoring. Apart from anything else, I am worried about equivalence and other issues in relation to general monitoring. Apart from a principled position against it, I think to be explicit is helpful.

Ofcom needs to be very careful, and that is what Amendment 190 sets out. It asks whether the alternatives have been thought about, whether the conditions have been thought about, and whether the potential impact has been thought about. That series of questions is essential. I am probably closer to the community that wants to see more powers and more interventions, but I would like that to be in a very monitored and regulated form.

I thank the noble Lord for his contribution. Some of these amendments must be supported because it is worrying for us as a country to have—what did the noble Lord call it?—ambiguity about whether something is possible. I do not think that is a useful ambiguity.

Baroness Bennett of Manor Castle Portrait Baroness Bennett of Manor Castle (GP)
- View Speech - Hansard - - - Excerpts

My Lords, my name is attached to Amendment 203 in this group, along with those of the noble Lords, Lord Clement-Jones, Lord Strathcarron and Lord Moylan. I shall speak in general terms about the nature of the group, because it is most usefully addressed through the fundamental issues that arise. I sincerely thank the noble Lord, Lord Allan, for his careful and comprehensive introduction to the group, which gave us a strong foundation. I have crossed out large amounts of what I had written down and will try not to repeat, but rather pick up some points and angles that I think need to be raised.

As was alluded to by the noble Baroness, Lady Kidron, this debate and the range of these amendments show that the Bill is currently extremely deficient and unclear in this area. It falls to this Committee to get some clarity and cut-through to see where we could end up and change where we are now.

I start by referring to a briefing, which I am sure many noble Lords have received, from a wide range of organisations, including Liberty, Big Brother Watch, the Open Rights Group, Article 19, the Electronic Frontier Foundation, Reset and Fair Vote. It is quite a range of organisations but very much in the human rights space, particularly the digital human rights space. The introduction of the briefing includes a sentence that gets to the heart of why many of us have received so many emails about this element of the Bill:

“None of us want to feel as though someone is looking over our shoulder when we are communicating”.

13:00
I take the point made by the noble Baroness, Lady Kidron, that many of our communications are scanned and this has an impact, but as the noble Lord, Lord Allan, said, end-to-end encryption prevents this. There is an increasing public awareness and understanding about that, and a desire to get that protection from the big tech companies the public utilises and clearly wishes to continue to utilise. That is a general public view, and one of the points made in the briefing is that so many people have very good reason to desire to maintain their privacy and be able to express themselves freely. The briefing notes that LGBTQIA+ people, for example, may wish that individual communications remain private.
I want to focus mostly on the broader issue of people who are looking to use services for public good. Some 40 million people in the UK use private messaging services every day, but some of those are journalists and activists for democracy and human rights around the world who are potentially putting themselves, and those with whom they communicate, in danger from repressive regimes. We have the problem that, if we open up the encryption, it can then be used by all kinds of different actors. It is worth putting on the record that the National Union of Journalists—I declare my position as a former newspaper editor—has expressed grave concerns about the duties in the Bill that bear on breaking encryption. It notes that this places
“journalists, sources and whistle-blowers in danger, creating a chilling effect that prevents individuals providing information that could help inform public interest journalism, and hold the powerful to account”.
I regret that I cannot be in the Committee on the economic crime Bill running in parallel to this, where we are talking about some of these issues. On the last group in that Bill, we talked about the importance of the media and NGOs in exposing economic crime, and that is true of many other areas of our society.
The noble Lord, Lord Allan, stressed what we might see if organisations choose to withdraw from the UK rather than leave their services here, but I want to address the point about what could happen if the organisations remain here and allow the set-up of systems for client-side scanning, as this Bill appears to point towards. That would make those tools available, and we know from experience around the world that, once such tech approaches are out there, they spread literally in the manner of a virus—in both the biological and the technical sense. They are then available to a whole lot of actors whom we do not want to have them.
It is worth coming back to the overall view of this. Sometimes we say that the security services were always able to open letters and look at individual communications, when we hope they had the legal basis to do so. Here we are talking about everything, everybody, all the time, which is an entirely different world situation to that individual, targeted legal basis.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I want to take advantage of the noble Baroness having raised that point to say that perhaps I was not clear enough in my speech. While I absolutely agree about not everything, everybody, all the time, for my specific concerns around child sexual abuse, abuse of women and so on, we have to find new world order ways of creating targeted approaches so it does not have to be everything, everybody, all the time.

Baroness Bennett of Manor Castle Portrait Baroness Bennett of Manor Castle (GP)
- Hansard - - - Excerpts

I am glad I gave the noble Baroness the opportunity for that intervention. I have a reasonable level of technical knowledge—I hand-coded my first website in 1999, so I go back some way—but given the structures we are dealing with, I question the capacity and whether it is possible to create the tools and say they will be used only in a certain way. If you break the door open, anyone can walk through the door—that is the situation we are in.

As the noble Lord, Lord Allan, said, this is a crucial part of the Bill that was not properly examined and worked through in the other place. I will conclude by saying that it is vital we have a full and proper debate in this area. I hope the Minister can reassure us that he and the department will continue to be in dialogue with noble Lords as the Bill goes forward.

Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I rise to speak to Amendment 205 in my name, but like other noble Lords I will speak about the group as a whole. After the contributions so far, not least from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Bennett of Manor Castle, there is not a great deal left for me to add. However, I will say that we have to understand that privacy is contextual. At one extreme, I know the remarks I make in your Lordships’ House are going to be carefully preserved and cherished; for several centuries, if not millennia, people will be able to see what I said today. If I am in my sitting room, having a private conversation, I expect that not to be heard by somebody, although at the same time I am dimly aware that there might be somebody on the other side of the wall who can hear what I am saying. Similarly, I am aware that if I use the telephone, it is possible that somebody is listening to the call. Somebody may have been duly authorised to do so by reference to a tribunal, having taken all the lawful steps necessary in order to listen to that call, because there are reasons that have persuaded a competent authority that the police service, or whatever, listening to my telephone call has a reason to do so, to avoid public harm or meet some other justified objective agreed on through legislation.

Here, we are going into a sphere of encryption where one assumes privacy and feels one is entitled to some privacy. However, it is possible that the regulator could at any moment step in and demand records from the past—records up to that point—without the intervention of a tribunal, as far as I can see, or without any reference to a warrant or having to explain to anybody their basis for doing so. They would be able to step in and do it. This is the point made by the noble Baroness, Lady Bennett of Manor Castle: unlike the telephone conversation, where it does not have to be everyone, everywhere, all the time—they are listening to just me and the person to whom I am talking—the provider has to have the capacity to go back, get all those records and be able to show Ofcom what it is that Ofcom is looking for. To do that requires them to change their encryption model fundamentally. It is not really possible to get away from everyone, everywhere, all the time, because the model has to be changed in order to do it.

That is why this is such an astonishing thing for the Government to insert in this Bill. I can understand why the security services and so forth want this power, and this is a vehicle to achieve something they have been trying to achieve for a long time. But there is very strong public resistance to it, and that resistance is entirely understandable; to do it in this space is completely at odds with the way in which we have felt it appropriate to authorise listening in on private conversations in the past—specific conversations, with the authority of a tribunal. To do it this way is a very radical change and one that needs to be considered almost apart from the Bill, not slipped in as a mere clause and administrative adjunct to it.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, there have been some excellent speeches so far. The noble Lord, Lord Allan of Hallam, brilliantly laid out why these amendments matter, and the noble Lord, Lord Moylan, explained why this has gained popular interest outside of the House. Not everything that goes on in this House is of interest and people do not study all of the speeches made by the noble Lord, Lord Moylan, even though they are always in the public sphere, but this particular group of amendments has elicited a huge amount of discussion.

We should remember that encrypted chat has become an indispensable part of the way that we live in this country and around the world. According to the Open Rights Group it has replaced the old-fashioned wired telephone—a rather quaint phrase. The fact that citizens of the United Kingdom value chat services so much that they are used by 60% of the total population should make us think about what we are doing regarding these services.

End-to-end encryption—the most secure form of encryption available—means that your messages are stored on your phone; people feel that they are in control because they are not on some server somewhere. Even WhatsApp cannot read your WhatsApp messages; that is the point of encryption. That is why people use it: the messages are secured with a lock which only you and the recipient have the special key to unlock to read them.

Obviously, there are certain problems. Certain Government Ministers wanted to voluntarily share all of their WhatsApp messages with a journalist who would then share them with the rest of us. If your Lordships were in that group you might have thought that was a rude thing to do. People have their WhatsApp messages leaked all the time, and when it happens we all think, “Oh my God, I’m glad I wasn’t in that WhatsApp group”, because you assume a level of privacy, even though as a grown-up you need to remember that somebody might leak them. But the main point is that they are a secure form of conversation that is widely used.

Everyone has a right to private conversations. I was thinking about how, when society closed down during the lockdown period, we needed technology in order to communicate with each other. We understood that we needed to WhatsApp message or Zoom call our friends and family, and the idea that this would involve the state listening in would have appalled us—we considered that our private life.

We want to be able to chat in confidence and be confident that only ourselves and the recipients can see what we are sharing and hear what we are saying. That is true of everyday life, but there are very good specific cases to be made for its importance, ranging through everything from Iranian women resisting the regime and communicating with each other, to all the civil liberties organisations around the world that use WhatsApp. The security of knowing that you can speak without Putin listening in or that President Xi will not be sent your WhatsApp messages is important.

The Government keep assuring us that we do not need to worry, but the Bill gives Ofcom the power to require services to install tools that would entail the surveillance of encrypted communications for child exploitation and terrorism content, for example. Advocates and people on my side argue that this is not possible without undermining encryption because, just as you cannot be half pregnant, you cannot be half encrypted once you install tools for scanning for certain content. There is a danger that we say, “We’re only doing it for those things”, but actually it would be an attack on encryption itself.

Unlike the noble Baroness, Lady Bennett of Manor Castle, I know nothing about the technical aspects of this, as noble Lords can hear from the way I am speaking about it. But I can see from a common-sense point of view what encryption is: you cannot say, “We’re only going to use it a little bit”. That is my point.

I want to tackle the issue of child abuse, because I know that it lurks around here. It is what really motivates the people who say, “It’s OK as long as we can deal with that”. This measure is put forward as a solution to the problem of encrypted chat services being used to send messages of that nature and to the question of what we can do about it. Of course I stress that images of child abuse and exploitation are abhorrent—that is a very important background to this conversation—but I want to draw attention to the question of what we are prepared to do about child abuse, because I think it was referred to in an earlier group. I am nervous that we are promising that this Bill is a silver bullet and that it will all be solved through some of these measures.

13:15
I noted that Professor Ross Anderson of the University of Cambridge said that we cannot expect artificial intelligence to replace police officers, teachers, and social workers in child protection. He said:
“The idea that complex social problems are amenable to cheap technical solutions is the siren song of the software salesman and has lured many a gullible government department on to the rocks”.
This is true. Most child abuse happens offline. Online child abuse needs to be dealt with, but I worry that we will say to people, “Don’t worry, all will be well because we’re dealing with it in the Online Safety Bill”.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

No one in the Committee, nor anyone standing behind us who speaks up for children, thinks that this is going to be a silver bullet. It is unacceptable to suggest that we take that position. Much child abuse takes place offline and is then put online, but the exponential way in which it is consumed, created and spread is entirely new because of the services we are talking about. Later in Committee I will explain some of the new ways in which this technology is creating child abuse—new forms, new technologies, new abuse.

I am sorry to interrupt the noble Baroness. I have made my feelings clear that I am not an end-to-end encryption “breaker”. There are amendments covering this; I believe some of them will come up later in the name of the noble Lord, Lord Russell, on safety by design and so on. I also agree with the noble Baroness that we need more resources in this area for the police, teachers, social workers and so on. However, I do not want child sexual abuse to be a football in this conversation.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

I agree with the noble Baroness, which is precisely why I am suggesting that we need to consider whether privacy should be sacrificed totally in relation to the argument around encryption. It is difficult, and I feel awkward saying it. When I mentioned a silver bullet I was not talking about the noble Baroness or any other noble Lords present, but I have heard people say that we need this Bill because it will deal with child abuse. In this group of amendments, I am raising the fact that when I have talked about encryption with people outside of the House they have said that we need to do something to tackle the fact that these messages are being sent around. It is not just child abuse; it is also terrorism. There is a range of difficult situations.

Things can go wrong with this, and that is what I was trying to raise. For example, we have a situation where some companies are considering using, or are being asked to use, machine learning to detect nudity. Just last year, a father lost his Google account and was reported to the police for sending a naked photo of his child to the doctor for medical reasons. I am raising these as examples of the problems that we have to consider.

Child abuse is so abhorrent that we will do anything to protect children, but let me say this to the Committee, as it is where the point on privacy lies: children are largely abused in their homes, but as far as I understand it we are not as yet arguing that the state should put CCTV cameras in every home for 24/7 surveillance to stop child abuse. That does not mean that we are glib or that we do not understand the seriousness of child abuse; it means that we understand the privacy of the home. There are specialist services that can intervene when they think there is a problem. I am worried about the possibility of putting a CCTV camera in everyone’s phone, which is the danger of going down this route.

My final point is that these services, such as WhatsApp, will potentially leave the UK. It is important to note that. I agree with the noble Lord, Lord Allan: this is not like threatening to storm off. It is not done in any kind of pique in that way. In putting enormous pressure on these platforms to scan communications, we must remember that they are global platforms. They have a system that works for billions of people all around the world. A relatively small market such as the UK is not something for which they would compromise their billions of users around the world. As I have explained, they would not put up with it if the Chinese state said, “We have to see people’s messages”. They would just say, “We are encrypted services”. They would walk out of China and we would all say, “Well done”. There is a real, strong possibility of these services leaving the UK so we must be very careful.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I just want to add to the exchange between the noble Baronesses, Lady Kidron and Lady Fox. The noble Baroness, Lady Fox, referred to WhatsApp’s position. Again, it is important for the public out there also to understand that if someone sends them illegal material—in particular child sexual abuse material; I agree with the noble Baroness, Lady Kidron, that this is a real problem—and they report it to WhatsApp, which has a reporting system, that material is no longer encrypted. It is sent in clear text and WhatsApp will give it to the police. One of the things I am suggesting is that, rather than driving WhatsApp out of the country, because it is at the more responsible end of the spectrum, we should work with it to improve these kinds of reporting systems and put the fear of God into people so that they know that this issue is not cost-free.

As a coda to that, if you ever receive something like that, you should report it to the police straightaway because, once it is on your phone, you are liable and you have a problem. The message from here should be: if you receive it, report it and, if it is reported, make sure that it gets to the police. We should be encouraging services to put those systems in place.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

The noble Lord has concluded with my conclusion, which was to say that those services will be driven out, but not because they are irresponsible around horrible, dangerous messages. They do not read our messages because they are private. However, if we ever receive anything that makes us feel uncomfortable, they should be put under pressure to act. Many of them already do and are actually very responsible, but that is different from demanding that they scan our messages and we breach that privacy.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - - - Excerpts

My Lords, that last exchange was incredibly helpful. I am grateful to the noble Lord, Lord Allan, for what he just said and the way in which he introduced this group. I want to make only a few brief remarks.

I have put my name to two amendments in this group: Amendment 202 in the name of the noble Lord, Lord Stevenson, which seeks to ensure that Ofcom will be subject to the same kind of requirements and controls as exist under the Regulation of Investigatory Powers Act before issuing a technology notice

“to a regulated service which offers private messaging with end-to-end encryption”;

and Amendment 285, also in the name of the noble Lord, Lord Stevenson, and that of the noble Lord, Lord Clement-Jones. This amendment would make sure that no social media platforms or private end-to-end messaging services have an obligation generally to monitor what is going on across their platforms. When I looked at this group and the various amendments in it, those were the two issues that I thought were critical. These two amendments seemed to approach them in the most simple and straightforward manner.

Like other noble Lords, my main concern is that I do not want search and social media platforms to have an obligation to become what we might describe as thought police. I do not want private messaging firms to start collecting and storing the content of our messages so that they have what we say ready to hand over in case they are required to do so. What the noble Lord, Lord Allan, just said is an important point to emphasise. Some of us heard from senior representatives from WhatsApp a few weeks ago. I was quite surprised to learn how much they are doing in this area to co-operate with the authorities; I felt very reassured to learn about that. I in no way want to discourage that, because they are doing an awful lot of good work.

Basically, this is such a sensitive matter, as has been said, that it is important for the Government to be clear what their policy intentions are by being clear in the Bill. If they do not intend to require general monitoring that needs to be made explicit. It is also important that, if Ofcom is to be given new investigatory powers or powers to insist on things through these technology notices, it is clear that its powers do not go beyond those that are already set out in law. As we have heard from noble Lords, there is widespread concern about this matter not just from the social media platforms and search engines themselves but from news organisations, journalists and those lobby groups that often speak out on liberty-type matters. These topics go across a wide range of interest groups, so I very much hope that my noble friend the Minister will be able to respond constructively and open-mindedly on them.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I was not intending to intervene on this group because my noble friend Lord Stevenson will address these amendments in their entirety, but listening in to this public conversation about this group of amendments has stimulated a question that I want both to put on the record and to give the Minister time to reflect on.

If we get the issues of privacy and encrypted messaging wrong, it will push more people into using VPN—virtual private network—services. I went into the app store on my phone to search for VPN software. There is nothing wrong with such software—our parliamentary devices have it to do general monitoring and make sure that we do not use services such as TikTok—but it is used to circumvent much of the regulatory regime that we are seeking to put together through this Bill. When I search for VPNs in the app store, the first one that comes up that is not a sponsored, promoted advertisement has an advisory age limit of four years old. Several of them are the same; some are 17-plus but most are four-plus. Clearly, the app promotes itself very much on the basis that it offers privacy and anonymity, which are the key features of a VPN. However, a review of it says, “I wouldn’t recommend people use this because it turns out that this company sends all its users’ data to China so that it can do general monitoring”.

I am not sure how VPNs are being addressed by the Bill, even though they seem really pertinent to the issues of privacy and encryption. I would be interested to hear whether—and, if we are, how—we are bringing the use and misuse of VPNs into scope for regulation by Ofcom.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, I would like to say something very quickly on VPN. I had a discussion with some teenagers recently, who were all prepared for this Bill—I was quite surprised that they knew a lot about it. They said, “Don’t worry, we’ve worked out how to get around it. Have you heard of VPN?” It reminded me of a visit to China, where I asked a group of students how they dealt with censorship and not being able to google. They said, “Don’t worry about it”, and showed me VPN. It is right that we draw attention to that. There is a danger of inadvertently forcing people on to the unregulated dark web and into areas that we might not imagine. That is why we have to be careful and proportionate in our response.

13:30
Lord McNally Portrait Lord McNally (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I have long been on record as being for radical reform of the House of Lords, but I do not think there are many Chambers in the world that could have had such an interesting debate on such a key subject—certainly not the House of Commons, sadly. Without falling into the old trap of saying what a wonderful lot we all are, it is important that, in such an important Bill, covering so many important areas of civil liberties and national security, there should be an opportunity, before we get to voting, to have this kind of debate and get some of the issues into the public domain.

I am on the same side as the noble Baroness, Lady Fox, on knowledge of the technology—looking back to 20 years ago, when I was on the committee that worked on the communications Bill which set up Ofcom, I see that we were genuinely innocents abroad. We deliberately decided not to try regulating the internet, because we did not know what it was going to do. I do not think that we can have that excuse today.

Perhaps an even more frightening background is that, for three and a half years, during the coalition Government, I was Minister for data protection—a Minister less equipped to protect your data I cannot imagine. However, I remember being taken to some place over the river to have a look at our capacities in this area. Having seen some of the things that were being done, I rather timidly asked the expert who was showing me round, “Aren’t there civil liberty issues in what you’re doing?” He said, “Oh no, sir. Tesco know far more about you than we do”.

There is this element about what is secret. The noble Baroness, Lady Fox, in her last contribution, said that children look with contempt at some of the safeguards and blockages that keep them away from things. I do not think anybody is deluding themselves that there is some silver bullet. As always, Parliament must do its best to address real national concerns and real problems in the best way that we see at this time. There is a degree of cross-party and Cross-Bench unity, in that there are real and present dangers in how these technologies are being used, and real and present abuses of a quite horrific kind. The noble Baroness, Lady Kidron, is right. This technology has given a quantum leap to the damage that the abuser and the pornographer can do to our society, in the same way that it has given a quantum leap to those who want to undermine the truth and fairness of our election system. There are real problems that must be addressed.

Although it has not been present in this debate, it is no help to polarise the argument as being between the state wanting to accrue more and more powers and brave defenders of civil liberties. As somebody who has practised some of these dark arts myself, I advise those who are organising letters to ensure that those sending them do not leave in the paragraph that says, “Here you may want to include some personal comments”. It waters down the credibility of this as some independent exercising of a democratic right.

I make a plea, as someone on the edges of the debate who at times had some direct responsibilities, to use what the Bill has thrown up to address whether it is now in the right shape—I hope the Minister hears it. The Government should not be ashamed to take it away and think a bit. It may be that we can add some of the protections that we quite often do, such as allowing certain interventions after a judge or senior police officer or others have been involved. That may already be in other parts of the Bill. However, it would be wrong to allow the Bill to polarise this, given that there was no one who spoke this morning who is not trying to deal with very real difficulties, problems and challenges, within the framework of a democratic society, in a way that protects our freedoms but also protects us from real and present dangers.

Lord Kamall Portrait Lord Kamall (Con)
- View Speech - Hansard - - - Excerpts

My Lords, this is the first time that I have spoken on the Bill in Committee. I know noble Lords are keen to move on and get through the groups as quickly as possible, but I hope they will forgive me if I say that I will speak only about twice on the Bill, and this is one of the groups that I want to speak to. I will try not to make your Lordships impatient.

I should tell the Committee a little about where I am coming from. I was very geeky as a kid. I learned to program and code. I did engineering at university and coded there. My master’s degree in the late 1980s was about technology and policy, so I have been interested in technology policy since then, having followed it through in my professional life. In 1996, I wrote a book on EU telecoms—it sold so well that no one has ever heard of it. One thing I said in that book, which though not an original thought is pertinent today, is that the regulation will always be behind the technology. We will always play catch-up, and we must be concerned about that.

Interestingly, when you look at studies of technology adoption—pioneers, early adopters and then the rest of the population—quite often you see that the adult industry is at the leading edge, such as with cable TV, satellite TV, video cassettes, online conferencing, et cetera. I assure your Lordships that I have not done too much primary research into this, but it is an issue that we ought to be aware of.

I will not speak often in this debate, because there are many issues that I do not want to disagree on. For example, I have already had a conversation with the noble Baroness, Lady Kidron, and we all agree that we need to protect children. We also know that we need to protect vulnerable adults; there is no disagreement on that. However, in these discussions there will be inevitable trade-offs between security and safety and freedom. It is right to have these conversations to ensure that we get the balance right, with the wisdom of noble Lords. Sacrifices will be made on either side of the debate, and we should be very careful as we navigate this.

I am worried about some of the consequences for freedom of expression. When I was head of a research think tank, one of the phenomena that I became interested in was that of unintended consequences. Well-meaning laws and measures have often led to unintended consequences. Some people call it a law of unintended consequences, and some call it a principle, and we should be careful about this. The other issue is subjectivity of harms. Given that we have taken “legal but harmful” out and there are amendments to the Bill to tackle harms, there will be a debate on the subjectivity of harms.

One reason I wanted to speak on this group is that some of the amendments tabled by noble Lords—too many to mention—deal with technology notices and ensuring that we are consistent between the offline and online worlds, particularly regarding the Regulation of Investigatory Powers Act. I welcome and support those amendments.

We also have to be aware that people will find a way around it, as the noble Baroness, Lady Fox, said. When I was looking at terrorism and technology, one of the issues that people raised with me was not to forget that one way around it was to create an email account and store stuff in a draft folder. You could then share the username and password with others who could then access that data, those pictures or those instructions in a draft folder. The noble Lord, Lord Allan, has gone some way to addressing that issue.

The other issue that we have to be clear about is how the tech sector can do more. It was interesting when my noble friend Lady Stowell organised a meeting with Meta, which was challenged particularly on coroners having access to information and pictures. It was very interesting when Meta told us what it could access: it does not know what is in the messages, but there are things that it can access, or advise people to access, on the user’s phone or at the other end. I am not sure whether the noble Baroness, Lady Kidron, has had the conversation with Meta, but it would be helpful and important to find some common ground there, and to probe and push Meta and others to make sure that they share that information more quickly, so we do not have to wait five years to get it via the coroner or whatever.

I want to talk about unintended consequences, particularly around end-to-end encryption. Even if you do not believe the big businesses and think that they are crying wolf when they say that they will quit the UK—although I believe that there is a threat of that, particularly when we continually want the UK to be a global hub for technology and innovation and so cannot afford for companies such as Meta, Signal and others to leave—you should listen to the journalists who are working with people, quite often dissidents, in many countries, and who rely on encrypted communications to communicate with them.

The other risk we should be aware of is that it is very difficult to keep technology to a few people. In my academic career, I also looked at technology transfer, both intentional and unintentional. We should look at the intelligence services and some of the innovations that happened: for example, when Concorde was designed, it was not long before the Soviets got their hands on that technology. Just as there used to be a chap called Bob in the exchange who could share information, there is always a weak spot in chains: the humans. Lots of humans have a price and can be bought, or they can be threatened, and things can be shared. The unintended consequence I am worried about is that this technology will get into the hands of totalitarian regimes. At the same time, it means that people over here who are desperately trying to help dissidents and others speak up for freedom in other countries will be unable to support them. We should be very careful and think about unintended consequences. For that reason, I support this group of amendments.

I really am looking forward to the responses from the Minister. I know that the noble Lord, Lord McNally, said that he was a Minister for three years on data protection; I was a Minister in this department for one month. I was so pleased that I had my dream job, as Minister for Civil Society and Heritage, and so proud of my party and this country because we had elected the first Asian Prime Minister; then, six days later, I got sacked. So, as they say, be careful what you wish for.

In this particular case, I am grateful to the noble Lords who have spoken up in this debate. I do not want to repeat any other points but just wanted to add that. I will not speak often, but I want to say that it is really critical that, when we look at this trade-off between security, safety and freedom, we get it right. One way of doing that is to make sure that, on technology notices and RIPA, we are consistent between the online and offline worlds.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, it has been a very good debate indeed. When I first saw this grouping, my heart sank: the idea that we should be able to encompass all that within the space of just over an hour seemed a bit beyond all of us, however skilled and experienced we were, and whatever background we were able to bring to the debate today. I agree with both noble Lords who observed that we have an expertise around here that is very unusual and extremely helpful in trying to drill down into some of these issues.

The good thing that has come out from this debate, which was summed up very well by the noble Lord, Lord Kamall, is that we are now beginning to address some of the underlying currents that the Bill as a boat is resting on—and the boat is a bit shaky. We have a very strong technological bias, and we are grateful for the masterclass from the noble Lord, Lord Allan of Hallam, on what is actually going on in the world that we are trying to legislate for. It leaves me absolutely terrified that we are in a situation where we appear to be trying to future-proof, possibly in the wrong direction. We should be very careful about that. We will want to reflect on the point he made on where the technology is driving this particular aspect of our social media and search engine operations.

13:45
The Bill is very wide ranging and, therefore, the amendments must necessarily follow it. But, in this group, we seem to be doing three things: we are trying to recognise whether there is a problem with encrypted messaging, and its relationship to security on the one hand and privacy and human rights on the other. I am very pleased that we are doing this, but I am not quite sure that we are in a position to make long-lasting conclusions. Like everybody else, I think that the burden falls on the Minister to convince us that he has reached the right place in the consideration of this and that his proposals will be right for the present day, let alone the future.
The noble Baroness, Lady Stowell, was right: we need to be very clear what the Government are trying to do here. I am afraid that I am not convinced that I know what it is. I put it to the Minister that he should make it very evident up front. This section of the Bill, and the way that we have been grouped into discussing it—because there are other things that we will need to come back to that relate to it—will need to be convincing. At the moment, I do not think that it is.
I say that because, if you go down where the Bill is trying to get to, it is very odd indeed that Ofcom has the powers to look at the messaging of private individuals and that the same body is also regulating. In other words, Ofcom is expected to be both gamekeeper and poacher. The points made around the Chamber on this issue are unanswerable. In the offline world, we have a structure that works through RIPA, which seems an exemplary model. I have heard the Minister say in private meetings that the procedures which will be in place in Ofcom will replicate that in every way and that there should be no concern about it, but the problem is the fact that it is the same body that is doing it. Enough has been said to make a very good case that at the very least, if we go ahead on the basis of what the Bill says, the decisions on whether or not the technologies can begin to peek into the encrypted world need to be authorised by an external body at a judicial level, and that it should follow the RIPA model, which has stood the test of time and seems to work very well and to everyone’s satisfaction. That is my first point.
My second point is that if we are to go down a technological route, we have to be certain that it is necessary; I worry that it is in advance of where we perhaps need to go, and that having a bit more time before it comes into place might be a way forward. I think we have heard enough from those who have written in and in the meetings we have had that this does not seem to be a hotspot for the police, who will have responsibility for doing quite a lot of the legwork on this. They seem to have powers which they could use to get to where they need to be in order to make sure that the crimes being commissioned or committed can be investigated and that those responsible are brought to justice. If that is the case, why are we putting in this extra step? Again, I do not have the confidence that the Bill is going in the right direction here.
We can add to that some of the technological issues, which are as important. If we have a technology capable of carrying out the inquisition of encrypted material in a way which will be satisfactory as defined by the legislation, is there not a risk that we are simply opening up the whole process to hackers and those who might be able to do more harm than good? One representation we had said that the requirement under Clause 110 to use accredited technology to identify CSEA and/or terrorism content, whether, in the words of the Bill,
“communicated publicly or privately by means of the service”,
means that a currently secure platform will be opened up to checks and scans of its users. That proposal imposes the decryption of something that is encrypted, which cannot be right. It would create too much of a risk for those who are, as we have heard, in many ways and in many parts of the world dependent on encryption to carry on doing the things that we want them to do. The ability to hijack this type of technology is a worry which I have not seen reflected in any of the discussions we have had with the Government on this point.
Finally, I know this is unpopular as far as the Government are concerned, but is there not a concern that we are driving a coach and horses through some of our well thought-through and important protections relating to human rights? The EHRC’s paper says that the provisions in Clause 110 may be disproportionate and an infringement of millions of individuals’ rights to privacy where those individuals are not suspected of any wrongdoing. This is not a right or wrong issue; it is a proportionality issue. We need to balance that. I do not know whether I have heard the Minister set out exactly why the measures in the Bill meet that set of conditions, so I would be grateful if he could talk about that or, if not, write to us. If we are in danger of heading into issues which are raised by Article 8 of the ECHR—I know the noble Lord opposite may not be a huge supporter of it, but it is an important part of our current law, and senior Ministers have said how important it will be in the future—surely we must have safeguards which will protect it.
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, this has indeed been a very good debate on a large group of amendments. We have benefited from two former Ministers, the noble Lord, Lord McNally, and my noble friend Lord Kamall. I hope it is some solace to my noble friend that, such a hard act is he to follow, his role has been taken on by two of us on the Front Bench—myself at DCMS and my noble friend Lord Camrose at the new Department for Science, Innovation and Technology.

The amendments in this group are concerned with the protection of user privacy under the Bill and the maintenance of end-to-end encryption. As noble Lords have noted, there has been some recent coverage of this policy in the media. That reporting has not always been accurate, and I take this opportunity to set the record straight in a number of areas and seek to provide the clarity which the noble Lord, Lord Stevenson of Balmacara, asked for just now.

Encryption plays a crucial role in the digital realm, and the UK supports its responsible use. The Bill does not ban any service design, nor will it require services materially to weaken any design. The Bill contains strong safeguards for privacy. Broadly, its safety duties require platforms to use proportionate systems and processes to mitigate the risks to users resulting from illegal content and content that is harmful to children. In doing so, platforms must consider and implement safeguards for privacy, including ensuring that they are complying with their legal responsibilities under data protection law.

With regard to private messaging, Ofcom will set out how companies can comply with their duties in a way that recognises the importance of protecting users’ privacy. Importantly, the Bill is clear that Ofcom cannot require companies to use proactive technology, such as automated scanning, on private communications in order to comply with their safety duties.

In addition to these cross-cutting protections, there are further safeguards concerning Ofcom’s ability to require the use of proactive technology, such as content identification technology on public channels. That is in Clause 124(6) of the Bill. Ofcom must consider a number of matters, including the impact on privacy and whether less intrusive measures would have the equivalent effect, before it can require a proactive technology.

The implementation of end-to-end encryption in a way that intentionally blinds companies to criminal activity on their services, however, has a disastrous effect on child safety. The National Center for Missing & Exploited Children in the United States of America estimates that more than half its reports could be lost if end-to-end encryption were implemented without preserving the ability to tackle child sexual abuse—a conundrum with which noble Lords grappled today. That is why our new regulatory framework must encourage technology companies to ensure that their safety measures keep pace with this evolving and pernicious threat, including minimising the risk that criminals are able to use end-to-end encrypted services to facilitate child sexual abuse and exploitation.

Given the serious risk of harm to children, the regulator must have appropriate powers to compel companies to take the most effective action to tackle such illegal and reprehensible content and activity on their services, including in private communications, subject to stringent legal safeguards. Under Clause 110, Ofcom will have a stand-alone power to require a provider to use, or make best endeavours to develop, accredited technology to tackle child sexual exploitation and abuse, whether communicated publicly or privately, by issuing a notice. Ofcom will use this power as a last resort only when all other measures have proven insufficient adequately to address the risk. The only other type of harm for which Ofcom can use this power is terrorist content, and only on public communications.

The use of the power in Clause 110 is subject to additional robust safeguards to ensure appropriate protection of users’ rights online. Ofcom will be able to require the use of technology accredited as being highly accurate only in specifically detecting illegal child sexual exploitation and abuse content, ensuring a minimal risk that legal content is wrongly identified. In addition, under Clause 112, Ofcom must consider a number of matters, including privacy and whether less intrusive means would have the same effect, before deciding whether it is necessary and proportionate to issue a notice.

The Bill also includes vital procedural safeguards in relation to Ofcom’s use of the power. If Ofcom concludes that issuing a notice is necessary and proportionate, it will need to publish a warning notice to provide the company an opportunity to make representations as to why the notice should not be issued or why the detail contained in it should be amended. In addition, the final notice must set out details of the rights of appeal under Clause 149. Users will also be able to complain to and seek action from a provider if the use of a specific technology results in their content incorrectly being removed and if they consider that technology is being used in a way that is not envisaged in the terms of service. Some of the examples given by the noble Baroness, Lady Fox of Buckley, pertain in this instance.

The Bill also recognises that in some cases there will be no available technology compatible with the particular service design. As I set out, this power cannot be used by Ofcom to require a company to take any action that is not proportionate, including removing or materially weakening encryption. That is why the Bill now includes an additional provision for this scenario, to allow Ofcom to require technology companies to use their best endeavours to develop or find new solutions that work on their services while meeting the same high standards of accuracy and privacy protection. Given the ingenuity and resourcefulness of the sector, it is reasonable to ask it to do everything possible to protect children from abuse and exploitation. I echo the comments made by the noble Lord, Lord Allan, about the work being done across the sector to do that.

More broadly, the regulator must uphold the right to privacy under its Human Rights Act obligations when implementing the new regime. It must ensure that its actions interfere with privacy only where it is lawful, necessary and proportionate to do so. I hope that addresses the question posed by the noble Lord, Lord Stevenson. In addition, Ofcom will be required to consult the Information Commissioner’s Office when developing codes of practice and relevant pieces of guidance.

I turn now to Amendments 14—

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Before the Minister does so, can he give a sense of what he means by “best endeavours” for those technology companies? If it is not going to be general monitoring of what is happening as the message moves from point to point—we have had some discussions about the impracticality and issues attached to monitoring at one end or the other—what, theoretically, could “best endeavours” possibly look like?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I am hesitant to give too tight a definition, because we want to remain technology neutral and make sure that we are keeping an open mind to developing changes. I will think about that and write to the noble Lord. The best endeavours will inevitably change over time as new technological solutions present themselves. I point to the resourcefulness of the sector in identifying those, but I will see whether there is anything more I can add.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

While the Minister is reflecting, I note that the words “best endeavours” are always a bit of a worry. The noble Lord, Lord Allan, made the good point that once such material is on your phone you are in trouble and you must report it, but the frustration of many people outside this Chamber is about what comes next: how the journey of that piece of material can be traced without breaking encryption. I speak to the tech companies very often—indeed, I used to speak to the noble Lord, Lord Allan, when he was in post at what was then Facebook—but that is the question that we would like answered in this Committee, because our sympathy stops at the response that “It is nothing to do with us”.

14:00
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

The noble Baroness’s intervention has given me an opportunity to note that I am about to say a little more on best endeavours, which will not fully answer the question from the noble Lord, Lord Knight, but I hope fleshes it out a little more.

I do that in turning to Amendments 14, 108 and 205, which seek to clarify that companies will not be required to undertake fundamental changes to the nature of their service, such as the removal or weakening of end-to-end encryption. As I previously set out, the Bill does not require companies to weaken or remove any design and there is no requirement for them to do so as part of their risk assessments or in response to a notice. Instead, companies will need to undertake risk assessments, including consideration of risks arising from the design of their services, before taking proportionate steps to mitigate and manage these risks. Where relevant, assessing the risks arising from end-to-end encryption will be an integral part of this process.

This risk management approach is well established in almost every other industry and it is right that we expect technology companies to take user safety into account when designing their products and services. We understand that technologies used to identify child sexual abuse and exploitation content, including on private communications, are in some cases nascent and complex. They continue to evolve, as I have said. That is why Ofcom has the power through the Bill to issue a notice requiring a company to make best endeavours to develop or source technology.

This notice will include clear, proportionate and enforceable steps that the company must take, based on the relevant information of the specific case. Before issuing a warning notice, Ofcom is expected to enter into informal consultation with the company and/or to exercise information-gathering powers to determine whether a notice is necessary and proportionate. This consultation period will assist in establishing what a notice to develop a technology may require and appropriate steps for the company to take to achieve best endeavours. That dialogue with Ofcom is part of the process.

Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

There are a lot of phrases here—best endeavours, proportionate, appropriate steps—that are rather subjective. The concern of a number of noble Lords is that we want to address this issue but it is a matter of how it is applied. That is one of the reasons why noble Lords were asking for some input from the legal profession, a judge or otherwise, to make those judgments.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

All the phrases used in the Bill are subject to the usual scrutiny through the judicial process—that is why we debate them now and think about their implications—but of course they can, and I am sure will, be tested in the usual legal ways. Once a company has developed a new technology that meets minimum standards of accuracy, Ofcom may require its use but not before considering matters including the impact on user privacy, as I have set out. The Bill does not specify which tools are likely to be required, as we cannot pre-empt Ofcom’s evidence-based and case-by-case assessment.

Amendment 285 intends to clarify that social media platforms will not be required to undertake general monitoring of the activity of their users. I agree that the protection of privacy is of utmost importance. I want to reassure noble Lords, in particular my noble friend Lady Stowell of Beeston, who asked about it, that the Bill does not require general monitoring of all content. The clear and strong safeguards for privacy will ensure that users’ rights are protected.

Setting out clear and specific safeguards will be more effective in protecting users’ privacy than adopting the approach set out in Amendment 285. Ofcom must consider a number of matters, including privacy, before it can require the use of proactive technology. The government amendments in this group, Amendments 290A to 290G, further clarify that technology which identifies words, phrases or images that indicate harm is subject to all of these restrictions. General monitoring is not a clearly defined concept—a point made just now by my noble friend Lord Kamall. It is used in EU law but is not defined clearly in that, and it is not a concept in UK law. This lack of clarity could create uncertainty that some technology companies might attempt to exploit in order to avoid taking necessary and proportionate steps to protect their users. That is why we resist Amendment 285.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

I understand the point the Minister is making, but it is absolutely crystal clear that, whatever phrase is used, the Government are saying on the record, at the Dispatch Box, that the Bill can in no way be read as requiring anybody to provide a view into private messaging or encrypted messaging unless there is good legal cause to suspect criminality. That is a point that the noble Baroness, Lady Stowell, made very clearly. One may not like the phrasing used in other legislatures, but could we find a form of words that will make it clear that those who are operating in this legal territory are absolutely certain about where they stand on that?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

My Lords, I want to give clear reassurance that the Bill does not require general monitoring of all content. We have clear and strong safeguards for privacy in the Bill to ensure that users’ rights are protected. I set out the concerns about use of the phrase “general monitoring”. I hope that provides clarity, but I may have missed the noble Lord’s point. The brief answer to the question I think he was asking is yes.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

Let the record stand clear: yes. It was the slight equivocation around how the Minister approached and left that point that worried me, and the possibility that people might seek to use it later. Words from the Dispatch Box are never absolute and they are never meant to be, but the fact that they have been said is important. I am sure that everybody understands that point, and the Minister did say “yes” to my question.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I did, and I am happy to say it again: yes.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

Perhaps I might go back to an earlier point. When the Minister said that the Government want to make sure that companies cannot avoid their obligations, I think he was implying that certain companies would try to avoid obligations to keep their users safe by threatening to leave or whatever. I want it to be clear that, in the instance of encrypted services, the obligations to the users of the service are to protect their privacy, and users see that as keeping them safe. It would be wrong to make privacy and safety polar opposites. I think that companies that run unencrypted services believe that to be what their duties are—so in a way there is a clash.

Secondly, I am delighted by the clarity of the Minister’s “yes” answer, but I think that maybe there needs to be clearer communication with people outside this Chamber. People are worried about whether duties placed on Ofcom to enact certain things would lead to some breach of encryption. No one thinks that the Government intend to do this or want to spy on anyone; the worry is that the unintended consequences of the duty on Ofcom might have that effect. If that is not going to be the case, and the Government can guarantee it and make that clear, it would reassure not just the companies but the users of messaging services, which would be helpful.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

The points the noble Baroness has just made bring me neatly to what I was about to say in relation to the question raised earlier by the noble Lord, Lord Knight of Weymouth. But first, I would say that Ofcom as a public body is subject to public law principles already, so those apply in this case.

The noble Lord, Lord Knight, asked about virtual private networks and the risk of displacing people on to VPNs or other similar alternatives. That is a point worth noting, not just in this group but as we consider all these amendments, particularly when we talk later on about age verification, pornography and so on. Services will need to think about how safety measures could be circumvented and take steps to prevent that, because they need to mitigate risk effectively. There may also be a role for enforcement action; Ofcom will be able to apply to the courts, where appropriate, to require business disruption measures against these services. We should certainly be mindful of the incentives for people to do that, and the example the noble Lord, Lord Knight, gave earlier is a useful lesson in the old adage “Caveat emptor” when looking at some of these providers.

I want to say a little bit about Amendments 205A and 290H in my name. Given the scale of child sexual abuse and exploitation that takes place online, and the reprehensible nature of these crimes, it is important that Ofcom has effective powers to require companies to tackle it. This brings me to these government amendments, which make small changes to the powers in Clause 110 to ensure that they are effective. I will focus particularly, in the first instance, on Amendment 290H, which ensures that Ofcom considers whether a service has features that allow content to be shared widely via another service when deciding whether content has been communicated publicly or privately, including for the purposes of issuing a notice. This addresses an issue highlighted by the Independent Reviewer of Terrorism Legislation, Jonathan Hall, and Professor Stuart Macdonald in a recent paper. The separate, technical amendment, Amendment 205A, clarifies that Clause 110(7) refers only to a notice on a user-to-user service.

Amendment 190 in the name of the noble Lord, Lord Clement-Jones, seeks to introduce a new privacy duty on Ofcom when considering whether to use any of its powers. The extensive privacy safeguards that I have already set out, along with Ofcom’s human rights obligations, would make this amendment unnecessary. Ofcom must also explicitly consult persons whom it considers to have expertise in the enforcement of the criminal law and the protection of national security, which is relevant to online safety matters in the course of preparing its draft codes. This may include the integrity and security of internet services where relevant.

Amendments 202 and 206, in the name of the noble Lord, Lord Stevenson of Balmacara, and Amendments 207, 208, 244, 246, 247, 248, 249 and 250 in the name of the noble Lord, Lord Clement-Jones, all seek to deliver privacy safeguards to notices issued under Clause 110 through additional review and appeals processes. There are already strong safeguards concerning this power. As part of the warning notice process, companies will be able to make representations to Ofcom which it is bound to consider before issuing a notice. Ofcom must also review any notice before the end of the period for which it has effect.

Amendment 202 proposes mirroring the safeguards of the Investigatory Powers Act when issuing notices to encrypted messaging services under this power. First, this would be inappropriate, because the powers in the Investigatory Powers Act serve different purposes from those in this Bill. The different legal safeguards in the Investigatory Powers Act reflect the potential intrusion by the state into an individual’s private communications; that is not the case with this Bill, which does not grant investigatory powers to state bodies, such as the ability to intercept private communications. Secondly, making a reference to encryption would be—

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

Is that right? I do not need a yes or no answer. It was rhetorical; I am just trying to frame the right question. The Minister is making a very strong point about the difference between RIPA requirements and those that might be brought in under this Bill. But it does not really get to the bottom of the questions we were asking. In this situation, whatever the exact analogy between the two systems is, it is clear that Ofcom is marking its own homework—which is fair enough, as there are representations, but it is not getting external advice or seeking judicial approval.

The Minister’s point was that that was okay because it was private companies involved. But we are saying here that these would be criminal offences taking place and that therefore there is bound to be interest from the police and other agencies, including anti-terrorism agencies. It is clearly similar to the RIPA arrangements, so could he just revisit that?

14:15
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

Yes, I think it is right. The Investigatory Powers Act is a tool for law enforcement and intelligence agencies, whereas the Bill is designed to regulate technology companies—an important high-level distinction. As such, the Bill does not grant investigatory powers to state bodies. It does not allow the Government or the regulator to access private messages. Instead, it requires companies to implement proportionate systems and processes to tackle illegal content on their platforms. I will come on to say a little about legal redress and the role of the courts in looking at Ofcom’s decisions so, if I may, I will respond to that in a moment.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

The Investigatory Powers Act includes a different form of technical notice, which is to put in place surveillance equipment. The noble Lord, Lord Stevenson, has a good point: we need to ensure that we do not have two regimes, both requiring companies to put in place technical equipment but with quite different standards applying.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I will certainly take that point away and I understand, of course, that different Acts require different duties of the same platforms. I will take that away and discuss it with colleagues in other departments who lead on investigatory powers.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- Hansard - - - Excerpts

Before my noble friend moves on, when he is reviewing that back in the office, could he also satisfy himself that the concerns coming from the journalism and news organisations in the context of RIPA are also understood and have been addressed? That is another angle which, from what my noble friend has said so far, I am not sure has really been acknowledged. That is not a criticism but it is worth him satisfying himself on it.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I am about to talk about the safeguards for journalists in the context of the Bill and the questions posed by the noble Baroness, Lady Bennett. However, I take my noble friend’s point about the implications of other Acts that are already on the statute book in that context as well.

Just to finish the train of thought of what I was saying on Amendment 202, making a reference to encryption, as it suggests, would be out of step with the wider approach of the Bill, which is to remain technology-neutral.

I come to the safeguards for journalistic protections, as touched on by the noble Baroness, Lady Bennett. The Government are fully committed to protecting the integrity of journalistic sources, and there is no intention or expectation that the tools required to be used under this power would result in a compromising of those sources. Any tools required on private communications must be accredited by Ofcom as highly accurate in detecting only child sexual abuse and exploitation content. These minimum standards of accuracy will be approved and published by the Secretary of State, following advice from Ofcom. We therefore expect it to be very unlikely that journalistic content will be falsely detected by the tools being required.

Under Clause 59, companies are obliged to report child sexual abuse material which is detected on their service to the National Crime Agency; this echoes a point made by the noble Lord, Lord Allan, in an earlier contribution. That would include child sexual abuse and exploitation material identified through tools required by a notice and, even in this event, the appropriate protections in relation to journalistic sources would be applied by the National Crime Agency if it were necessary to identify individuals involved in sharing illegal material.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I want to flag that in the context of terrorist content, this is quite high risk for journalists. It is quite common for them, for example, to be circulating a horrific ISIS video not because they support ISIS but because it is part of a news article they are putting together. We should flag that terrorist content in particular is commonly distributed by journalists and it could be picked up by any system that is not sufficiently sophisticated.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I see that my noble friend Lord Murray of Blidworth has joined the Front Bench in anticipation of the lunch-break business for the Home Office. That gives me the opportunity to say that I will discuss some of these points with him, my noble friend Lord Sharpe of Epsom and others at the Home Office.

Amendment 246 aims to ensure that there is no requirement for a provider to comply with a notice until the High Court has determined the appeal. The Government have ensured that, in addition to judicial review through the High Court, there is an accessible and relatively affordable alternative means of appealing Ofcom’s decisions via the Upper Tribunal. We cannot accept amendments such as this, which could unacceptably delay Ofcom’s ability to issue a notice, because that would leave children vulnerable.

To ensure that Ofcom’s use of its powers under Clause 110, and the technology that underpins it, are transparent, Ofcom will produce an annual report about the exercise of its functions using these powers. This must be submitted to the Secretary of State and laid before Parliament. The report must also provide the details of technology that has been assessed as meeting minimum standards of accuracy, and Ofcom may also consider other factors, including the impact of technologies on privacy. That will be separate to Ofcom’s annual report to allow for full scrutiny of this power.

The legislation also places a statutory requirement on Ofcom to publish guidance before its functions with regard to Clause 110 come into force. This will be after Royal Assent, given that the legislation is subject to change until that point. Before producing the guidance, Ofcom must consult the Information Commissioner. As I said, there are already strong safeguards regarding Ofcom’s use of these powers, so we think that this additional oversight is unnecessary.

Amendments 203 and 204, tabled by the noble Lord, Lord Clement-Jones, seek to probe the privacy implications of Ofcom’s powers to require technology under Clause 110. I reiterate that the Bill will not ban or weaken any design, including end-to-end encryption. But, given the scale of child sexual abuse and exploitation taking place on private communications, it is important that Ofcom has effective powers to require companies to tackle this abhorrent activity. Data from the Office for National Statistics show that in nearly three-quarters of cases where children are contacted online by someone they do not know, this takes place by private message. This highlights the scale of the threat and the importance of technology providers taking steps to safeguard children in private spaces online.

As already set out, there are already strong safeguards regarding the use of this power, and these will prevent Ofcom from requiring the use of any technology that would undermine a platform’s security and put users’ privacy at risk. These safeguards will also ensure that platforms will not be required to conduct mass scanning of private communications by default.

Until the regime comes into force, it is of course not possible to say with certainty which tools would be accredited. However, some illustrative examples of the kinds of current tools we might expect to be used—providing that they are highly accurate and compatible with a service’s design—are machine learning or artificial intelligence, which assess content to determine whether it is illegal, and hashing technology, which works by assigning a unique number to an image that has been identified as illegal.

Given the particularly abhorrent nature of the crimes we are discussing, it is important that services giving rise to a risk of child sexual abuse and exploitation in the UK are covered, wherever they are based. The Bill, including Ofcom’s ability to issue notices in relation to this or to terrorism, will therefore have extraterritorial effect. The Bill will apply to any relevant service that is linked to the UK. A service is linked to the UK if it has a significant number of UK users, if UK users form a target market or if the service is capable of being used in the UK and there is a material risk of significant harm to individuals in the UK arising from the service. I hope that that reassures the noble Lord, on behalf of his noble friend, about why that amendment is not needed.

Amendments 209 to 214 seek to place additional requirements on Ofcom to consider the effect on user privacy when using its powers under Clause 110. I agree that tackling online harm needs to take place while protecting privacy and security online, which is why Ofcom already has to consider user privacy before issuing notices under Clause 110, among the other stringent safeguards I have set out. Amendment 202A would impose a duty on Ofcom to issue a notice under Clause 110, where it is satisfied that it is necessary and proportionate to do so—this will have involved ensuring that the safeguards have been met.

Ofcom will have access to a wide range of information and must have the discretion to decide the most appropriate course of action in any particular scenario, including where this action lies outside the powers and procedures conferred by Clause 110; for instance, an initial period of voluntary engagement. This is an in extremis power. It is essential that we balance users’ rights with the need to enable a strong response, so Ofcom must be able to assess whether any alternative, less intrusive measures would effectively reduce the level of child sexual exploitation and abuse or terrorist content occurring on a service before issuing a notice.

I hope that that provides reassurance to noble Lords on the amendments in this group, and I invite the noble Lord to withdraw Amendment 14.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

My Lords, this has been a very useful debate and serves as a good appetite builder for lunch, which I understand we will be able to take shortly.

I am grateful to the Minister for his response and to all noble Lords who have taken part in the debate. As always, the noble Baroness, Lady Kidron, gave us a balanced view of digital rights—the right to privacy and to security—and the fact that we should be trying to advance these two things simultaneously. She was right again to remind us that this is a real problem and there is a lot we can do. I know she has worked on this through things such as metadata—understanding who is communicating with whom—which might strike that nice balance where we are not infringing on people’s privacy too grossly but are still able to identify those who wish harm on our society and in particular on our children.

The noble Baroness, Lady Bennett, was right to pick up this tension between everything, everywhere, all at once and targeted surveillance. Again, that is really interesting to tease out. I am personally quite comfortable with quite intrusive targeted surveillance. I do not know whether noble Lords have been reading the Pegasus spyware stories: I am not comfortable with some Governments placing such spyware on the phones of human rights defenders but I would be much more relaxed about the British authorities placing something similar on the phones of people who are going to plant bombs in Manchester. We need to be really honest about where we are drawing our red lines if we want to go in the direction of targeted surveillance.

The noble Lord, Lord Moylan, was right again to remind us about the importance of private conversations. I cited the example of police officers whose conversations have been exposed. Although it is hard, we should remember that if ordinary citizens want to exchange horrible racist jokes with each other and so on in private groups that is not a matter for the state, but it is when it is somebody in a position of public authority; we have a right to intervene there. Again, we have to remember that as long as it is not illegal people can say horrible things in private, and we should not encourage any situation where we suggest that the state would interfere unless there are legitimate grounds—for example, it is a police officer or somebody is doing something that crosses the line of legality.

The noble Baroness, Lady Fox, reminded us that it is either encrypted or it is not. That is really helpful, as things cannot be half encrypted. If a service provider makes a commitment it is critical that it is truthful. That is what our privacy law tells us. If I say, “This service is encrypted between you and the person you send the message to”, and I know that there is somebody in between who could access it, I am lying. I cannot say it is a private service unless it is truly private. We have to bear that in mind. Historically, people might have been more comfortable with fudging it, but not in 2023, when we have this raft of privacy legislation.

The noble Baroness is also right to remind us that privacy can be safety. There is almost nothing more devastating than the leaking of intimate images. When services such as iCloud move to encrypted storage that dramatically reduces the risk that somebody will get access to your intimate images if you store them there, which you are legally entitled to do. Privacy can be a critical part of an individual maintaining their own security and we should not lose that.

The noble Baroness, Lady Stowell, was right again to talk about general monitoring. I am pleased that she found the WhatsApp briefing useful. I was unable to attend but I know from previous contact that there are people doing good work and it is sad that that often does not come out. We end up with this very polarised debate, which my noble friend Lord McNally was right to remind us is unhelpful. The people south of the river are often working very closely in the public interest with people in tech companies. Public rhetoric tends to focus on why more is not being done; there are very few thanks for what is being done. I would like to see the debate move a little more in that direction.

The noble Lord, Lord Knight, opened up a whole new world of pain with VPNs, which I am sure we will come back to. I say simply that if we get the regulatory frameworks right, most people in Britain will continue to use mainstream services as long as they are allowed to be offered. If those services are regulated by the European Union under its Digital Services Act and pertain to the UK and the US in a similar way, they will in effect have global standards, so it will not matter where you VPN from. The scenario the noble Lord painted, which I worry about, is where those mainstream services are not available and we drive people into small, new services that are not regulated by anyone. We would then end up inadvertently driving people back to the wild west that we complain about, when most of them would prefer to use mainstream services that are properly regulated by Ofcom, the European Commission and the US authorities.

14:30
I shall search out the book written by the noble Lord, Lord Kamall, but he was right to talk about unintended consequences. Critically, we are in a world of known unknowns here: we know that there will be an issue when the technical notices are issued, but we do not have the technical notices, so it is really hard for us to understand how far they will be a problem.
The noble Lord, Lord Stevenson, talked about the human rights aspect. Again, that is critical. How do we know whether the powers are proportionate if we do not know what Ofcom is going to tell companies to do? That is the problem. To his credit, the Minister tried to respond and gave some more clarity. There was some in there—and people out there will pore over this like a sacred text to try to understand what was said—but what I heard was, “If you’re already offering an end-to-end encrypted service, we’re not going to tell you to get rid of it, but if your service isn’t currently end-to-end encrypted, we may”. I heard the words “if you are deliberately blinding yourself to the bad content”. That sounds to me like, “Don’t start encrypting if you’re not already encrypted”. If that is the Government’s intention, it may be reasonable, but we will need to tease it out further. It is quite a big deal. Looking forward, we have to ask whether, if end-to-end encrypted services did not exist today and were coming on to the market, Ofcom would try to use the powers to stop them coming on to the market or whether they would be relaxed. We still have a lot of known unknowns in this space that I am sure we will come back to.
I am conscious of the time. I am sure that there will be people out there who are looking at this debate. I remind them that, at this stage, we never vote on anything. I am sure that we will come back to this issue at later stages, where we may vote on it. I beg leave to withdraw the amendment.
Amendment 14 withdrawn.
Amendment 15 not moved.
Clause 6, as amended, agreed.
Clause 7 agreed.
House resumed.
First Reading
14:32
The Bill was brought from the Commons, read a first time and ordered to be printed.
14:33
Sitting suspended. Committee to begin again not before 3.03 pm.
Committee (3rd Day) (Continued)
15:03
Clause 8: Illegal content risk assessment duties
Amendment 16
Moved by
16: Clause 8, page 7, line 16, after “governance,” insert “terms of service,”
Member’s explanatory statement
This amendment makes clear that “design and operation of a service” includes its terms of service.
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

My Lords, this group of amendments concerns terms of service. All the amendments either have the phrase “terms of service” in them or imply that we wish to see more use of the phrase in the Bill, and seek to try to tidy up some of the other bits around that which have crept into the Bill.

Why are we doing that? Rather late in the day, terms of service have suddenly become a key fulcrum, under which much of the activity relating to people’s usage of social media and services on the internet will be expressed, in relation to how they view the material coming to them. With the loss of the adult “legal but harmful” provisions, we also lost quite a considerable amount of what would have been primary legislation, which no doubt would have been backed up by codes of practice. The situation we are left with, and which we need to look at very closely, is the triple shield at the heart of the new obligations on companies, and, in particular, on their terms of service. That is set out primarily in Clauses 64, 65, 66 and 67, and is a subject to which my amendments largely refer.

Users of the services would be more confident that the Government have got their focus on terms of service right, if they actually said what should be said on the tin, as the expression goes. If it is the case that something in a terms of service was written and implemented so that material which should be taken down was indeed taken down, these would become reliable methods of judging whether or not the service is the one people want to have, and the free market would be seen to be working to empower people to make their own decisions about what level of risk they can assume by using a service. That is a major change from the way the Bill was originally envisaged. Because this was done late, we have one or two of the matters to which I have referred already, which means that the amendments focus on changing what is currently in the Bill.

It is also true that the changes were not consulted upon; I do not recall there being any document from government about whether this was a good way forward. The changes were certainly not considered by the Joint Committee, of which several of those present were members—we did not discuss it in the Joint Committee and made no recommendation on it. The level of scrutiny we have enjoyed on the Bill has been absent in this area. The right reverend Prelate the Bishop of Oxford will speak shortly to amendments about terms of service, and we will be able to come back to it. I think it would have been appropriate had the earlier amendment in the name of the noble Lord, Lord Pickles, been in this group because the issue was the terms of service, even though it had many other elements that were important and that we did discuss.

The main focus of my speech is that the Government have not managed to link this new idea of terms of service and the responsibilities that will flow from that to the rest of the Bill. It does not seem to fit into the overall architecture. For example, it is not a design feature, and does not seem to work through in that way. This is a largely self-contained series of clauses. We are trying to ask some of the world’s largest companies, on behalf of the people who use them, to do things on an almost contractual basis. Terms of service are not a contract that you sign up to, but you certainly click something—or occasionally click it, if you remember to—by which you consent to the company operating in a particular set of ways. In a sense, that is a contract, but is it really a contract? At the heart of that contract between companies and users is whether the terms of service are well captured in the way the Bill is organised. I think there are gaps.

The Bill does have something that we welcome and want to hold on to, which is that the process under which the risks are assessed and decisions taken about how companies operate and how Ofcom relates to those decisions is about the design and operation of the service—both the design and the operation, something that the noble Baroness, Lady Kidron, is very keen to emphasise at all times. It all starts and ends with design, and the operation is a consequence of design choices. Other noble Baronesses have mentioned in the debate that small companies get it right and so, when they grow, can be confident that what they are doing is something that is worth doing. Design, and operating that design to make a service, is really important. Are terms of service part of that or are they different, and does it matter? It seems to me that they are downstream from the design: something can be designed and then have terms of service that were not really part of the original process. What is happening here?

My Amendments 16, 21, 66DA, 75 and 197 would ensure that the terms of service are included within the list of matters that constitute “design and operation” of the service at each point that it occurs. I have had to go right through the Bill to add it in certain areas—in a rather irritating way, I am sure, for the Bill team—because sometimes we find that what I think should be a term of service is actually described as something else, such as a “a publicly available statement”, whatever that is. It would be an advantage if we went through it again and defined terms of service and made sure that that was what we were talking about.

Amendments 70 to 72, 79 to 81 and 174 seek to help the Government and their officials with tidying up the drafting, which probably has not been scrutinised enough to pick up these issues. It may not matter, at the end of the day, but what is in the Bill is going to be law and we may as well try to get it right as best we can. I am sure the Minister will say we really do not need to worry about this because it is all about risks and outcomes, and if a company does not protect children or has illegal content, or the user-empowerment duties—the toggling—do not work, Ofcom will find a way of driving the company to sort it out. What does that mean in practice? Does it mean that Ofcom has a role in defining what terms of service are? It is not in the Bill and may not reach the Bill, but it is something that will be a bit of problem if we do not resolve what we mean by it, even if it is not by changing the legislation.

If the Minister were to disagree with my approach, it would be quite nice to have it said at the Dispatch Box so that we can look at that. The key question is: are terms of service an integral part of the design and operation of a service and, if so, can we extend the term to make sure that all aspects of the services people consume are covered by adequate and effective terms of service? There is probably going to be division in the way we approach this because, clearly, whether they are terms of service or have another name, the actual enforcement of illegal and children’s duties will be effected by Ofcom, irrespective of the wording of the Bill—I do not want to question that. However, there is obviously an overlap into questions about adults and others who are affected by the terms of service. If you cannot identify what the terms of service say in relation to something you might not wish to receive because the terms of service are imprecise, how on earth are you going to operate the services, the toggles and things, around it? If you look at that and accept there will be pressure within the market to get these terms of service right, there will be a lot of dialogue with Ofcom. I accept that all that will happen, but it would be good if the position of the terms of service was clarified in the Bill before it becomes law and that Ofcom’s powers in relation to those are clarified—do they or do they not have the chance to review terms of service if they turn out to be ineffective in practice? If that is the case, how are we going to see this work out in practice in terms of what people will be able to do about it, either through redress or by taking the issue to court? I beg to move.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

I support these amendments, which were set out wonderfully by the noble Lord, Lord Stevenson. I want to raise a point made on Tuesday when the noble Baroness, Lady Merron, said that only 3% of people read terms of service and I said that 98% of people do not read them, so one of us is wrong, but I think the direction of travel is clear. She also used a very interesting phrase about prominence, and I want to use this opportunity to ask the Minister whether there is some lever whereby Ofcom can insist on prominence for certain sorts of material—a hierarchy of information, if you like—because these are really important pieces of information, buried in the wrong place so that even 2% or 3% of people may not find them.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I am very pleased that the noble Lord, Lord Stevenson, has given us the opportunity to talk about terms of service, and I will make three points again, in a shorter intervention than on the previous group.

First, terms of service are critical as the impact of terms of service will generally be much greater in terms of the amount of intervention that occurs on content than it will ever be under the law. Terms of service create, in effect, a body of private law for a community, and they are nearly always a superset of the public law—indeed, it is very common for the first items of a terms of service to say, “You must not do anything illegal”. This raises the interesting question of “illegal where?”—what it generally means is that you must not do anything illegal in the jurisdiction in which the service provider is established. The terms of service will say, “Do not do anything illegal”, and then they will give a whole list of other things, as well as illegality, that you cannot do on the platform, and I think this is right because they have different characteristics.

15:15
Secondly, to back up the point made by the noble Baroness, Lady Kidron, we need to be realistic that no one will ever read all the terms of service of the services that they use. There was a study that looked at how long it would take to read the terms of service on a typical mobile phone—I think it is around 10 days; given that they get updated most years, are any of us going to spend 10 days a year reading the terms of service?
We like our real-world analogues: we read all of the law, but none of the people out there read all of the laws of the land unless and until they have a problem, at which point they do read them. Terms of service are very similar in that people are not going to read them and we should not expect people to read them unless and until they have a problem that requires them to do so. I do not mean that as a counsel of despair, but we have to be realistic about what we are expecting people to do.
Thirdly, the Bill is going to make terms of service longer, and we need to get over that. The challenge is always that you want your terms of service to be comprehensive and easy for users, and as we move in the Bill towards making terms of service more actionable—which we are doing because the Bill says that Ofcom will be able to say, “Did you apply your terms of service properly?”—the lawyers for the platforms are going to be saying “What have we missed out?” and “If there is anything we have missed out, we have to go and stick it in there because now we are going to have a regulator breathing down our neck, checking whether or not we have done what we say”.
We should be realistic that we are asking companies to be entirely comprehensive and transparent, and in general that will mean making their terms of service longer. Again, this is not a complete counsel of despair. We can follow Mark Twain’s advice:
“I didn’t have time to write you a short letter, so I wrote you a long one”
and invest the time. That is what we can do to try to make terms of service shorter, rather than just saying to lawyers, “We will pay you by the word, and the more words there are, the happier we are”. But, again, we should be realistic: if it is comprehensive, it is going to be long; there is no way to avoid that.
The noble Lord, Lord Stevenson, asked whether or not it is a contract—that is an interesting question, certainly for the US providers. In the US the regulation, such that there is, is done largely by the Federal Trade Commission, and the concept is whether or not services are engaged in unfair or deceptive practices. An unfair or deceptive practice is not doing what you said you would do in the terms of service or a critical document of that nature.
Interestingly, all the incentive in the US is to be as vague as possible because if you have not said that you will do things, you cannot be hauled in front of the FTC. The EU generally creates incentives to be as comprehensive as possible, and I was involved in a number of cases where the company I worked for was taken to court and forced to add in more text because the US text was seen as too skeletal—that is a familiar debate to us here, whether we like things to be skeletal or for everything to be filled in.
So we need to be cognisant of that as we build terms of service into the Bill. This is not an argument against the amendments, but rather to say that as we do this, we need to be clear that we may be pulling in opposite directions. They need to be comprehensive, yet easy to use. “We are going to hold you accountable in the US; therefore, you should be vague; but we are also going to hold you accountable in the UK if you are too vague”—where is the right point of specificity and vagueness?
Having said that, it is really important that we focus on this because from a user’s point of view you are far more likely to come across an issue with the terms than an issue with the law—this is great, because most people in this country are law-abiding and not seeking to break the law.
The final point is that sometimes there is a tendency to think that everyone should have uniform terms of service. I can see the argument for a baseline, but in a vibrant market there is a strong case to say that we should celebrate where they are different, and there are communities that are different. For example, if you have a service that targets young people you might want to prohibit swearing; whereas, for example, it would be completely inappropriate to prohibit swearing in a vibrant political community for adults only. There are lots of areas where people understand that the context is different. For example, there are places where nudity—not pornography—is okay, and places where it is not.
So having different terms of service for different types of service is healthy, but I also think that Ofcom making sure that people do what they say they do is a reasonably healthy development, as long as we recognise and accept the consequences of that.
Lord Parkinson of Whitley Bay Portrait The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful for this short and focused debate, which has been helpful, and for the points made by the noble Lords, Lord Stevenson and Lord Allan, and the noble Baroness, Lady Kidron. I think we all share the same objective: ensuring that terms of service promote accountability and transparency, and empower users.

One of the Bill’s key objectives is to ensure that the terms of service of user-to-user platforms are suitable and effective. Under the Bill, companies will be required both to set out clearly how they will tackle illegal content and protect children and to ensure that their terms of service are properly enforced. The additional transparency and accountability duties on category 1 services will further ensure that users know what to expect on the largest platforms. This will put an end to these services arbitrarily removing content or, conversely, failing to remove content that they profess to prohibit.

The Bill will also ensure that search services are clear to their users about how they are complying with their adult and child safety duties under this new law. Given the very different way in which search services operate, however, this will be achieved through a publicly available statement rather than through terms of service. The two are intended to be distinct.

Noble Lords are right to point to the question of intelligibility. It struck me that, if it takes 10 days to read terms of service, perhaps we should have a race during the 10 days allotted to this Committee stage to see which is quicker—but I take the point. The noble Lord, Lord Allan, is also right that the further requirements imposed through this Bill will only add to that.

The noble Baroness, Lady Kidron, asked a fair question about what “accessibility” means. The Bill requires all platforms’ terms of service for illegal content and child safety duties to be clear and accessible. Ofcom will provide guidance on what that means, including ensuring that they are suitably prominent. The same applies to terms of service for category 1 services relating to content moderation.

I will focus first on Amendments 16, 21, 66DA, 75 and 197, which seek to ensure that both Ofcom and platforms consider the risks associated with platforms’ terms of service with regard to the illegal content and child safety duties in the Bill. We do not think that these amendments are needed. User-to-user services will already be required to assess the risks regarding their terms of service for illegal content. Clause 8 requires companies to assess the “design and operation” of a service in relation to illegal content. As terms of service are integral to how a service operates, they would be covered by this provision. Similarly, Clause 10 sets out that companies likely to be accessed by children will be required to assess the “design and operation” of a service as part of their child risk assessments, which would include the extent to which their terms of service may reduce or increase the risk of harm to children.

In addition to those risk assessment duties, the safety duties will require companies to take proportionate measures effectively to manage and mitigate the risk of harm to people whom they have identified through risk assessments. This will include making changes to their terms of service, if appropriate. The Bill does not impose duties on search services relating to terms of service, as search services’ terms of service play a less important role in determining how users can engage on a platform. I will explain this point further when responding to specific amendments relating to search services but I can assure the noble Lord, Lord Stevenson, that search services will have comprehensive duties to understand and mitigate how the design and operation of their service affects risk.

Amendment 197 would require Ofcom to assess how platforms’ terms of service affect the risk of harm to people that the sector presents. While I agree that this is an important risk factor which Ofcom must consider, it is already provided for in Clause 89, which requires Ofcom to undertake an assessment of risk across regulated services. That requires Ofcom to consider which characteristics of regulated services give rise to harm. Given how integral terms of service are to how many technology companies function, Ofcom will necessarily consider the risk associated with terms of service when undertaking that risk assessment.

However, elevating terms of service above other systems and processes, as mentioned in Clause 89, would imply that Ofcom needs to take account of the risk of harm on the regulated service, more than it needs to do so for other safety-by-design systems and processes or for content moderation processes, for instance. That may not be suitable, particularly as the service delivery methods will inevitably change over time. Instead, Clause 89 has been written to give Ofcom scope to organise its risk assessment, risk register and risk profiles as it thinks suitable. That is appropriate, given that it is best placed to develop detailed knowledge of the matters in question as they evolve over time.

Amendments 70, 71, 72, 79, 80, 81, 174 and 302 seek to replace the Bill’s references to publicly available statements, in relation to search services, with terms of service. This would mean that search services would have to publish how they are complying with their illegal content and child protection duties in terms of service rather than in publicly available statements. I appreciate the spirit in which the noble Lord has tabled and introduced these amendments. However, they do not consider the very different ways in which search services operate.

User-to-user services’ terms of service fulfil a very specific purpose. They govern a user’s behaviour on the service and set rules on what a user is allowed to post and how they can interact with others. If a user breaks these terms, a service can block his or her access or remove his or her content. Under the status quo, users have very few mechanisms by which to hold user-to-user platforms accountable to these terms, meaning that users can arbitrarily see their content removed with few or no avenues for redress. Equally, a user may choose to use a service because its terms and conditions lead them to believe that certain types of content are prohibited while in practice the company does not enforce the relevant terms.

The Bill’s duties relating to user-to-user services’ terms of service seek to redress this imbalance. They will ensure that people know what to expect on a platform and enable them to hold platforms accountable. In contrast, users of search services do not create content or interact with other users. Users can search for anything without restriction from the search service provider, although a search term may not always return results. It is therefore not necessary to provide detailed information on what a user can and cannot do on a search service. The existing duties on such services will ensure that search engines are clear to users about how they are complying with their safety duties. The Bill will require search services to set out how they are fulfilling them, in publicly available statements. Their actions must meet the standards set by Ofcom. Using these statements will ensure that search services are as transparent as user-to-user services about how they are complying with their safety duties.

The noble Lord’s Amendment 174 also seeks to expand the transparency reporting requirements to cover the scope and application of the terms of service set out by search service providers. This too is unnecessary because, via Schedule 8, the Bill already ensures transparency about the scope and application of the provisions that search services must make publicly available. I hope that gives the noble Lord some reassurance that the concerns he has raised are already covered. With that, I invite him to withdraw Amendment 16.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I am very grateful to the Minister for that very detailed response, which I will have to read very carefully because it was quite complicated. That is the answer to my question. Terms of service will not be very easy to identify because to answer my questions he has had to pray in aid issues that Ofcom will necessarily have to assess—terms of service—to get at whether the companies are performing the duties that the Bill requires of them.

I will not go further on that. We know that there will be enough there to answer the main questions I had about this. I take the point about search being distinctively different in this area, although a tidy mind like mine likes to see all these things in one place and understand all the words. Every time I see “publicly available statement”, I do not know why but I think about people being hanged in public rather than a term of service or a contract.

15:30
The noble Lord, Lord Allan, made the point that nobody ever reads these terms of service. We generally agree with that, but if you are married to a lawyer, as I am, you read an awful lot more of these things than you perhaps feel are good for your diet. I cannot even go on holiday until I have proven to her that I have read every word of my insurance policy on what I will be shipped home with. It is a frightening thought that some people do that because they like doing it, and she does.
I will not take this much further. The jibe that I had at the beginning—that this does not quite fit with the rest of the Bill—is still there, but we will not get much change out of what we are doing. The important thing is that, even though it is a rather complicated route, it looks as though Ofcom will have, possibly retrospectively and with more transparency than actual powers, the ability to look at terms of service when they are not working.
What I miss is the ability to set a standard for terms of service that is broadly acceptable to people, which was exactly the point that the noble Lord made: they cannot be so complex that you will not read them but they have to be sufficient to achieve what they do. I am still lost about what you can use the triple shield for if you do not know whether the services will deliver what you know you do not want. I beg leave to withdraw the amendment.
Amendment 16 withdrawn.
Amendment 16A
Moved by
16A: Clause 8, page 7, line 23, after “19(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 19 about supplying records of risk assessments to OFCOM.
Amendment 16A agreed.
Clause 8, as amended, agreed.
Clause 9: Safety duties about illegal content
Amendments 16B and 16C
Moved by
16B: Clause 9, page 7, line 27, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to summarise illegal content risk assessments in the terms of service (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 1 services.
16C: Clause 9, page 7, line 27, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to summarise illegal content risk assessments in the terms of service (see the amendment inserting new subsection (8A) below) is imposed only on providers of Category 1 services.
Amendments 16B and 16C agreed.
Amendment 17
Moved by
17: Clause 9, page 7, line 30, leave out “prevent individuals from” and insert “protect individuals from harms arising due to them”
Member’s explanatory statement
This amendment, along with the other amendment to Clause 9 in the name of Lord Moylan, adds a requirement to protect individuals from harm, rather than monitoring, prior restraint and/or denial of access. Further obligations to mitigate and manage harm, including to remove unlawful content that is signalled to the service provider, are unchanged by this amendment.
Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

My Lords, this is a very large and wide-ranging group of amendments. Within it, I have a number of amendments that, on their own, span three separate subjects. I propose to address these one after the other in my opening remarks, but other subjects will be brought in as the debate continues and other noble Lords speak to their own amendments.

If I split the amendments that I am speaking to into three groups, the first is Amendments 17 and 18. These relate to Clause 9, on page 7, where safety duties about illegal content are set out. The first of those amendments addresses the obligation to prevent individuals encountering priority illegal content by means of the service.

Earlier this week in Committee, I asked the Minister whether the Government understood “prevent” and “protect”, both of which they use in the legislation, to have different weight. I did not expect my noble friend to give an answer at that point, but I know that he will have reflected on it. We need clarity about this at some point, because courts will be looking at, listening to and reading what the Government say at the Dispatch Box about the weight to be given to these words. To my mind, to prevent something happening requires active measures in advance that ensure as far as reasonably and humanly possible that it does not actually happen, but one could be talking about something more reactive to protect someone from something happening.

This distinction is of great importance to internet companies—I am not talking about the big platforms—which will be placed, as I say repeatedly, under very heavy burdens by the Bill. It is possible that they simply will not be able to discharge them and will have to go out of business.

Let us take Wikipedia, which was mentioned earlier in Committee. It operates in 300 languages but employs 700 moderators globally to check what is happening. If it is required by Clause 9 to

“prevent individuals from encountering priority illegal content by means of the service”,

it will have to scrutinise what is put up on this community-driven website as or before it appears. Quite clearly, something such as Welsh Wikipedia—there is Wikipedia in Welsh—simply would not get off the ground if it had to meet that standard, because the number of people who would have to be employed to do that would be far more than the service could sustain. However, if we had something closer to the wording I suggest in my amendment, where services have to take steps to “protect” people—so they could react to something and take it down when they become aware of it—it all becomes a great deal more tolerable.

Similarly, Amendment 18 addresses subsection (3) of the same clause, where there is a

“duty to operate a service using proportionate systems and processes … to … minimise the length of time”

for which content is present. How do you know whether you are minimising the length of time? How is that to be judged? What is the standard by which that is to be measured? Would it not be a great deal better and more achievable if the wording I propose, which is that you simply are under an obligation to take it down, were inserted? That is my first group of amendments. I put that to my noble friend and say that all these amendments are probing to some extent at this stage. I would like to hear how he thinks that this can actually be operated.

My second group is quite small, because it contains only Amendment 135. Here I am grateful to the charity JUSTICE for its help in drawing attention to this issue. This amendment deals with Schedule 7, on page 202, where the priority offences are set out. Paragraph 4 of the schedule says that a priority offence includes:

“An offence under any of the following provisions of the Public Order Act 1986”.


One of those is Section 5 of that Act, “Harassment, alarm or distress”. Here I make a very different point and return to territory I have been familiar with in the past. We debated this only yesterday in Grand Committee, although I personally was unable to be there: the whole territory of hate crimes, harmful and upsetting words, and how they are to be judged and dealt with. In this case, my amendment would remove Section 5 of the Public Order Act from the list of priority offences.

If society has enough problems tolerating the police going round and telling us when we have done or said harmful and hurtful things and upbraiding us for it, is it really possible to consider—without the widest form of censorship—that it is appropriate for internet platforms to judge us, shut us down and shut down our communications on the basis of their judgment of what we should be allowed to say? We already know that there is widespread suspicion that some internet platforms are too quick to close down, for example, gender critical speech. We seem to be giving them something close to a legislative mandate to be very trigger-happy when it comes to closing down speech by saying that it engages, or could engage, Section 5 of the Public Order Act. I will come to the question of how they judge it in my third group, in a moment—but the noble Lord might be able to help me.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

Just to reinforce the point the noble Lord, Lord Moylan, made on that, I certainly had experience of where the police became the complainants. They would request, for example, that you take down an English Defence League event, claiming that it would be likely to cause a public order problem. I have no sympathy whatever with the English Defence League, but I am very concerned about the police saying “You must remove a political demonstration” to a platform and citing the legal grounds for doing that. The noble Lord is on to a very valid point to be concerned about that.

Lord Moylan Portrait Lord Moylan (Con)
- Hansard - - - Excerpts

I am grateful to the noble Lord. I really wonder whether the Government realise what they are walking into here. On the one hand, yesterday the Grand Committee was debating the statutory instrument putting in place new statutory guidance for the police on how to enforce, much more sensitively than in the past, non-crime hate incidents. However, on the other hand, the next day in this Chamber we are putting an obligation on a set of mostly foreign private companies to act as a police force to go around bullying us and closing us down if we say something that engages Section 5 of the Public Order Act. I think this is something the Government are going to regret, and I would very much like to hear what my noble friend has to say about that.

Finally, I come to my third group of amendments: Amendments 274, 278, 279 and 283. They are all related and on one topic. These relate to the text of the Bill on page 145, in Clause 170. Here we are discussing what judgments providers have to make when they come to decide what material to take down. Inevitably, they will have to make judgments. That is one of the unfortunate things about this Bill. A great deal of what we do in our lives is going to have to be based on judgments made by private companies, many of which are based abroad but which we are trying to legislate for.

It makes a certain sense that the law should say what they should take account of in making those judgments. But the guidance—or rather, the mandate—given to those companies by Clause 170 is, again, very hair-trigger. Clause 170(5), which I am proposing we amend, states:

“In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is … of the kind in question”.


I am suggesting that “reasonable grounds to infer” should be replaced with “sufficient evidence to infer”, so that they have to be able to produce some evidence that they are justified in taking content down. The test should be higher than simply having “reasonable grounds”, which may rest on a suspicion and little evidence at all. So one of those amendments relates to strengthening that bar so that they must have real evidence before they can take censorship action.

I add only two words to subsection (6), which talks about reasonable grounds for the inference—it defines what the reasonable grounds are—that

“exist in relation to content and an offence if, following the approach in subsection (2)”

and so on. I am saying “if and only if”—in other words, I make it clear that this is the only basis on which material can be censored using the provisions in this section, so as to limit it from going more widely. The third amendment in my group is essentially consequential to that.

15:45
We are all worried in this Committee about the prospect of speech being censored in a way which infringes the freedom of speech rights that we fought so hard to establish and which are also embedded in Article 10 of the European Convention on Human Rights. We want to have a legal structure that does not empower providers to act as private sector censors ranging over what we do, except in circumstances where it is wholly justified and in the public interest. The language in the Bill is far too loose for this purpose. It does not give us the protection. It does not do what my noble friend said it would do when he spoke at Second Reading, which is to strike the right balance. These amendments in my third group—and indeed the one in my second group—are there to help strike the right balance. I beg to move.
Lord Bishop of Guildford Portrait The Lord Bishop of Guildford
- View Speech - Hansard - - - Excerpts

My Lords, I will speak to Amendments 128, 130 and 132, as well as Amendments 143 to 153 in this grouping. They were tabled in the name of my right reverend colleague the Bishop of Derby, who is sorry that she cannot be here today.

The Church of England is the biggest provider of youth provision in our communities and educates around 1 million of our nation’s children. My colleague’s commitment to the principles behind these amendments also springs from her experience as vice chair of the Children’s Society. The amendments in this grouping are intended to strengthen legislation on online grooming for the purpose of child criminal exploitation, addressing existing gaps and ensuring that children are properly protected. They are also intended to make it easier for evidence of children being groomed online for criminal exploitation to be reported by online platforms to the police and the National Crime Agency.

Research from 2017 shows that one in four young people reported seeing illicit drugs advertised for sale on social media—a percentage that is likely to be considerably higher six years on. According to the Youth Endowment Fund in 2022, 20% of young people reported having seen online content promoting gang membership in the preceding 12 months, with 24% reporting content involving the carrying, use or promotion of weapons.

In relation to drugs, that later research noted that these platforms provide opportunities for dealers to build trust with potential customers, with young people reporting that they are more likely to see a groomer advertising drugs as a friend than as a dealer. This leaves young people vulnerable to exploitation, thereby reducing the scruples or trepidation they might feel about buying drugs in the first place. Meanwhile, it is also clear that social media is changing the operation of the county lines model. There is no longer the need to transport children from cities into the countryside to sell drugs, given that children who live in less populated areas can be groomed online as easily as in person. A range of digital platforms is therefore being used to target potential recruits among children and young people, with digital technologies also being deployed—for example, to monitor their whereabouts on a drugs run.

More research is being carried out by the Children’s Society, whose practitioners reported a notable increase in the number of perpetrators grooming children through social media and gaming sites during the first and second waves of the pandemic. Young people were being contacted with promotional material about lifestyles they could lead and the advantages of working within a gang, and were then asked to do jobs in exchange for money or status within this new group. It is true that some such offences could be prosecuted under the Modern Slavery Act 2015, but there remains a huge disparity between the scale of exploitation and the number of those being charged under the Act. Without a definition of child exploitation for criminal purposes, large numbers of children are being groomed online and paying the price for crimes committed by some of their most dangerous and unscrupulous elders.

It is vital that we protect our children from online content which facilitates that criminal exploitation, in the same way that we are looking to protect them from sexual exploitation. Platforms must be required to monitor for illegal content related to child criminal exploitation on their sites and to have mechanisms in place for users to flag it with those platforms so it can be removed. This can be achieved by bringing modern slavery and trafficking, of which child criminal exploitation is a form, within the scope of illegal content in the Bill, which is what these amendments seek to do. It is also vital that the law sets out clear expectations on platforms to report evidence of child criminal exploitation to the National Crime Agency in the same way as they are expected to report content involving child sexual exploitation and abuse, so that child victims can be identified and receive support. Such evidence may enable action against the perpetrators without the need for a disclosure from child victims. I therefore fully support and endorse the amendments standing in the name of the right reverend Prelate.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- View Speech - Hansard - - - Excerpts

My Lords, this is again a very helpful set of amendments. I want to share some experience that shows that legality tests are really hard. Often from the outside there is an assumption that it is easy to understand what is legal and illegal in terms of speech, but in practice that is very rarely the case. There is almost never a bright line, except in a small class of child sexual abuse material where it is always illegal and, as soon as you see the material, you know it is illegal and you can act on it. In pretty much every other case, you have to look at what is in front of you.

I will take a very specific example. Something we had to deal with was images of Abdullah Öcalan, the leader of the PKK in Turkey. If somebody shared a picture of Abdullah Öcalan, were they committing a very serious offence, which is the promotion of terrorism? Were they indicating support for the peace process that was taking place in Turkey? Were they showing that they support his socialist and feminist ideals? Were they supporting the YPG, a group in Syria that venerates him and to which we were sending arms? This is one example of many I could give where the content in front of you does not tell you very clearly whether the speech is illegal or whether it is speech that should be permitted. Indeed, we would take speech like that down and I would get complaints, including from Members of Parliament, saying, “Why have you removed that speech? I’m entitled to talk about Abdullah Öcalan”, and we would enter into an argument with them.

We would often ask lawyers in different countries whether they could tell us whether particular speech was legal or illegal. The answer would come back as probably illegal, likely illegal, maybe illegal and, occasionally, definitely not illegal, but it was nearly always somewhere on a spectrum. The amendments we are proposing today are to try to understand where the Government intend people to draw that line when they get that advice. Let us assume the company wants to do the right thing, follow the instructions of the Bill and remove illegal content. At what level does it say the test has been sufficiently met, given that in the vast majority of cases, apart from that small class of clearly illegal content, it is going to be given only a likelihood or a probability? As the noble Lord, Lord Moylan, pointed out, we are trying to insert this notion of sufficient evidence with Amendments 273, 275, 277, 280 and 281 in the names of my noble friend Lord Clement-Jones and the noble Viscount, Lord Colville, who is unable to be in his place today. I think the noble Baroness, Lady Kidron, may also have signed them. We are trying to flesh out the point at which that illegality standard should kick in.

Just to understand again how this often works when the law gets involved, I say that there is a law in Germany; the short version is NetzDG. If there are any German speakers who can pronounce the compound noun that is its full title, there will be a prize. It is a long compound word that means “network enforcement Act”. It has been in place for a few years and it tells companies to do something similar—to remove content that is illegal in Germany. There would be cases where we would get a report from somebody saying, “This is illegal”, and we would take action; then it went into the German system and, three months later, we would finally be told whether it was actually illegal, in a 12-page judgment that a German court had worked through. In the meantime, all we could do was work on our best guess while that process was going on. I think we need to be very clear that illegality is hard.

Cross-jurisdictional issues present us with another set of challenges. If both the speaker and the audience are in the United Kingdom, it is fairly clear. But in many cases, when we are talking about online platforms, one or other of them, or even both the speaker and the audience, may be outside the United Kingdom. Again, when does the speech become illegal? It may be entirely legal speech between two people in the United States. I think—and I would appreciate clarification from the Minister—that the working assumption is that if the speech was reported by someone not in the United States but in the UK, the platform would be required to restrict access to it from the UK, even though the speech is entirely legal in the jurisdiction in which it took place. Because the person in the UK encountered it, there would be a duty to restrict it. Again, it has been clarified that there is certainly not a duty to take the speech down, because it is entirely legal speech outside the UK. These cross-jurisdictional issues are interesting; I hope the Minister can clarify them.

The amendments also try to think about how this would work in practice. Amendment 287 talks about how guidance should be drawn up in consultation with UK lawyers. That is to avoid a situation where platforms are guessing too much at what UK lawyers want; they should at least have sought UK legal advice. That advice will then be fed into the guidance given to their human reviewers and their algorithms. That is the way, in practice, in which people will carry out the review. There is a really interesting practical question—which, again, comes up under NetzDG—about the extent to which platforms should be investing in legal review of content that is clearly against their terms of service.

There will be two kinds of platform. There will be some platforms that see themselves as champions of freedom of expression and say they will only remove stuff that is illegal in the UK, and everything else can stay up. I think that is a minority of platforms—they tend to be on the fringes. As soon as a platform gets a mainstream audience, it has to go further. Most platforms will have terms of service that go way beyond UK law. In that case, they will be removing the hate speech, and they will be confident that they will remove UK-illegal hate speech within that. They will remove the terrorist content. They will be confident and will not need to do a second test of the legality in order to be able to remove that content. There is a practical question about the extent to which platforms should be required to do a second test of legality if something is already prohibited under their terms.

There will be, broadly speaking again, four buckets of content. There will be content that is clearly against a platform’s terms, which it will want to get rid of immediately. It will not want to test it again for legality; it will just get rid of it.

There will be a second bucket of content that is not apparently against a platform’s terms but clearly illegal in the UK. That is a very small subset of content: in Germany, that is Holocaust denial content; in the United Kingdom, this Parliament has looked at Holocaust denial and chosen not to criminalise it, so that will not be there, but an equivalent for us would be migration advice. Migration advice will not be against the terms of service of most platforms, but the Government’s intention in the Illegal Migration Bill is to make it illegal and require it to be removed, and the consequential effect will be that it has to be removed under the terms of this Bill. So there will be that small set of content that is illegal in the UK but not against terms of service.

There will be a third bucket of content that is not apparently against the terms or the law, and that actually accounts for most of the complaints that a platform gets. I will choose my language delicately: complaint systems are easy to use, and people complain to make a point; they use complaint systems as if they were dislike buttons. The reality is that one of the most common sets of complaints you get is when there is a football match and the two opposing teams report the content on each other’s pages as illegal. They will do that every time, and you get used to it, and that is why you learn to discount mass-volume complaints. But again, we should be clear that there are a great many complaints that are merely vexatious.

The final bucket is content that is unclear and for which legal review will be needed. Our amendments are intended to deal with those cases. A platform will go out and get advice. It is trying to understand at what point something like migration advice tips over into the illegal, as opposed to being advice about going on holiday, and it is trying to understand that based on what it can immediately see. Once it has sought that advice, it will feed it back into the guidance to reviewers and the algorithms, to try to remove content more effectively, comply with the Bill as a whole and stay out of trouble with Ofcom.

Some areas are harder than others. The noble Lord, Lord Moylan, already highlighted one: public order offences, which are extremely hard. If somebody says something offensive or holds an offensive political view—I suspect the noble Baroness, Lady Fox, may have something to say on this—people may well make contact and claim that it is in breach of public order law. On the face of it, they may have a reasonably arguable case but again, as a platform, you are left to make a decision.

16:00
There is a really interesting potential role for Ofcom here. One thing that is frustrating if you work at a platform is that you will often get stuck and when you go out and look for advice, you find it is hard to get it. When I ran a working group with some French lawyers, including quite senior judges, they came into the working group saying, “This is all straightforward—you’re just not removing the illegal stuff”. So we gave them real cases and it was interesting to see how half of the lawyers in the room would be on one side, saying “It must come down—it’s against French law” while the other half was saying, “How could you possibly take this down in France?”, because it was protected speech. It is really difficult to get that judgment but, interestingly, an unintended consequence of the Bill may be that Ofcom will ultimately get stuck in that position.
The Bill is not about Ofcom making rulings on individual items of content but if—as in the example I shared with the noble Lord, Lord Moylan, earlier—the police have said to a platform, “You must remove this demonstration. It is illegal”, and the platform said, “No, we judge it not to be illegal”, where are the police going to go? They will go to Ofcom and say, “Look, this platform is breaching the law”, so Ofcom is going to get pulled into that kind of decision-making. I do not envy it that but, again, we need to plan for that scenario because people who complain about illegality will go wherever they think they can get a hearing, and Ofcom will be one of those entities.
A huge amount in this illegal content area still needs to be teased out. I ask the Minister to respond specifically to the points I have raised around whose jurisdiction applies. If the speaker is speaking legally, because they are in a country outside the United Kingdom, what is the Government’s expectation of platforms in those circumstances? Will he look at the issue of the tests and where on this spectrum—from probably illegal, through likely to be illegal, to maybe illegal—the Government expect platforms to draw the line? If platforms have removed the bad content, will he consider carefully to what extent the Government think that the platforms should have to go through the process of investing time and energy to work out whether they removed it for illegality or for a terms of service breach? That is interesting but, if our focus is on safety, it is frankly wasted effort. We need to question how far we expect the platforms to do that.
Baroness Buscombe Portrait Baroness Buscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, before speaking to my Amendment 137, I want to put a marker down to say that I strongly support Amendment 135 in the name of my noble friend Lord Moylan. I will not repeat anything that he said but I agree with absolutely every word.

Amendment 137 is in my name and those of my noble and learned friend Lord Garnier and the noble Lord, Lord Moore of Etchingham. This amendment is one of five which I have tabled with the aim of meeting a core purpose of the Bill. In the words of my noble friend the Minister in response to Amendment 1, it is

“to protect users of all ages from being exposed to illegal content”—[Official Report, 19/4/23; col. 724.]

—in short, to ensure that what is illegal offline is illegal online.

If accepted, this small group of amendments would, I strongly believe, make a really important difference to millions of people’s lives—people who are not necessarily listed in Clause 12. I therefore ask the Committee to allow me to briefly demonstrate the need for these amendments through the prism of millions of people and their families working and living in rural areas. They are often quite isolated and working alone in remote communities, and are increasingly at risk of or are already suffering awful online abuse and harassment. This abuse often goes way beyond suffering; it destroys businesses and a way of life.

I find it extraordinary that the Bill seems to contain nothing to do with livelihoods. It is all about focusing on feelings, which of course are important—and the most important focus is children—but people’s businesses and livelihoods are being destroyed through abuse online.

Research carried out by the Countryside Alliance has revealed a deeply disturbing trend online that appears to be disproportionately affecting people who live in rural areas and who are involved in rural pursuits. Beyond direct abuse, a far more insidious tactic that activists have adopted involves targeting businesses involved in activities of which they disapprove, such as livestock farming or hosting shoots. They post fake reviews on platforms including Tripadvisor and Google Maps, and their aim is to damage the victim, their business and their reputation by, to put it colloquially, trashing their business and thereby putting off potential customers. This is what some call trolling.

Let me be clear that I absolutely defend, to my core, the right to freedom of expression and speech, and indeed the right to offend. Just upsetting someone is way below the bar for the Bill, or any legislation. I am deeply concerned about the hate crime—or non-crime—issue we debated yesterday; in fact, I put off reading the debate because I so disagree with this nonsense from the College of Policing.

Writing a negative review directly based on a negative experience is entirely acceptable in my book, albeit unpleasant for the business targeted. My amendments seek to address something far more heinous and wrong, which, to date, can only be addressed as libel and, therefore, through the civil courts. Colleagues in both your Lordships’ House and in another place shared with me tremendously upsetting examples, from their constituents and their neighbourhoods, of how anonymous activists are ruining the lives of hard-working people who love this country and are going the extra mile to defend our culture, historic ways of life and freedoms.

Fortunately, through the Bill, the Government are taking an important step by introducing a criminal offence of false communications. With the leave of the Committee, I will briefly cite and explain the other amendments in order to make sense of Amendment 137. One of the challenges of the offence of false communications is the need to recognise that so much of the harm that underpins the whole reason why the Bill is necessary is the consequence of allowing anonymity. It is so easy to destroy and debilitate others by remaining anonymous and using false communications. Why be anonymous if you have any spine at all to stand up for what you believe? It is not possible offline—when writing a letter to a newspaper, for example—so why is it acceptable online? The usual tech business excuse of protecting individuals in rogue states is no longer acceptable, given the level of harm that anonymity causes here at home.

Therefore, my Amendment 106 seeks to address the appalling effect of harm, of whatever nature, arising from false or threatening communications committed by unverified or anonymous users—this is what we refer to as trolling. Amendments 266 and 267, in my name and those of my noble and learned friend Lord Garnier and my noble friend Lord Leicester, would widen the scope of this new and welcome offence of false communications to include financial harm, and harm to the subject of the false message arising from its communication to third parties.

The Bill will have failed unless we act beyond feelings and harm to the person and include loss of livelihood. As I said, I am amazed that it is not front and centre of the Bill after safety for our children. Amendment 268, also supported by my noble and learned friend, would bring within the scope of the communications offences the instigation of such offences by others—for example, Twitter storms, which can involve inciting others to make threats without doing so directly. Currently, we are unsure whether encouraging others to spread false information—for example, by posting fake reviews of businesses for ideologically motivated reasons—would become an offence under the Bill. We believe that it should, and my Amendment 268 would address this issue.

I turn briefly to the specifics of my Amendment 137. Schedule 7 lists a set of “priority offences” that social media platforms must act to prevent, and they must remove messages giving rise to certain offences. However, the list does not include the new communications offences created elsewhere in Part 10. We believe that this is a glaring anomaly. If there is a reason why the new communications offences are not listed, it is important that we understand why. I hope that my noble friend the Minister can explain.

The practical effect of Amendment 137 would be to include the communications offences introduced in the Bill and communications giving rise to them within the definition of “relevant offence” and “priority illegal content” for the purposes of Clause 53(4) and (7) and otherwise.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I ask the Committee to have a level of imagination here because I have been asked to read the speech of the noble Viscount, Lord Colville—

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - - - Excerpts

I do not know who advised the noble Baroness—and forgive me for getting up and getting all former Leader on her—but this is a practice that we seem to have adopted in the last couple of years and that I find very odd. It is perfectly proper for the noble Baroness to deploy the noble Viscount’s arguments, but to read his speech is completely in contravention of our guidance.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

I beg the pardon of the Committee. I asked about it and was misinformed; I will do as the noble Baroness says.

The noble Viscount, Lord Colville, is unable to be with us. He put his name to Amendments 273, 275, 277 and 280. His concern is that the Bill sets the threshold for illegality too low and that in spite of the direction provided by Clause 170, the standards for determining illegality are too vague.

I will make a couple of points on that thought. Clause 170(6) directs that a provider must have

“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”,

but that does not mean that the platform has to be certain that the content is illegal before it takes it down. This is concerning when you take it in combination with what or who will make judgments on illegality.

If a human moderator makes the decision, how much information they gather in order to make that judgment will depend on the resources and time available to them. Unlike in a court case, where a wide range of information and context can be gathered, when it comes to decisions about content online these resources are very rarely available to human moderators, who have a vast amount of content to get through.

If an automated system makes the judgment, it is very well established that algorithms are not good at context—the Communications and Digital Committee took evidence on this repeatedly when I was on it. AI simply uses the information available in the content itself to make a decision, which can lead to significant missteps. Clause 170(3) requires the decision-makers to judge whether there is a defence for the content. In the context of algorithms, it is very unclear how they will come to such a judgment from the content itself.

I understand that these are probing amendments, but I think the concern is that the vagueness of the definition will lead to too much content being taken down. This concern was supported by Parliament’s Joint Committee on Human Rights, which wrote to the former Culture Secretary, Nadine Dorries, on that matter. I apologise again.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- View Speech - Hansard - - - Excerpts

My Lords, I support the amendments in this group that probe how removing illegal material is understood and will be used under the Bill. The noble Lord, Lord Moylan, explained a lot of my concerns, as indeed did the noble Viscount, Lord Colville, via his avatar. We have heard a range of very interesting contributions that need to be taken seriously by the Government. I have put my name to a number of amendments.

The identification of illegal material might be clear and obvious in some cases—even many cases. It sounds so black and white: “Don’t publish illegal material”. But defining communications of this nature can be highly complex, so much so that it is traditionally reserved for law enforcement bodies and the judicial system. We have already heard from the noble Lord, Lord Moylan, that, despite Home Secretaries, this House, regulations and all sorts of laws having indicated that non-crime hate incidents, for example, should not be pursued by the police, they continue to pursue them as though they are criminal acts. That is exactly the kind of issue we have.

16:15
I noted earlier that the noble Lord, Lord Bethell, made a passionate intervention about, of all things, Andrew Tate and his illegality in relation to this Bill. That prompted me to think about a number of things. Andrew Tate is an influencer whom I despise, as I do the kind of things he says. But, as far as I know, the criminal allegations he faces are not yet resolved, so he has to be seen as innocent until proven guilty. Most of what he has online that is egregious might well be in bad taste, as people say—I would say that it is usually misogynist—but it is not against the law. If we get to a situation where that is described as illegality, that is the kind of thing that I worry about. As we have heard from other noble Lords, removing so-called illegal content for the purpose of complying with this regulatory system will mean facing such dilemmas.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

In talking about individuals and investigations, the noble Baroness reminded me of one class of content where we do have clarity, and that is contempt of court. That is a frequent request. We know that it is illegal in that case because a judge writes to the company and says, “You must not allow this to be said because it is in contempt of court”. In live proceedings it is absolutely clear, because a judge has told you—but that really is the exception. In most other cases, someone is simply saying, “I think it is illegal”.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

That is very helpful.

I am concerned that removing so-called illegal content for the purpose of complying with the regulatory system covers not only that which reaches conviction in a criminal court but possibly anything that a platform determines could be illegal, and therefore it undermines our own legal system. As I have said, that marks a significant departure from the rule of law. It seems that the state is asking or mandating private companies to make determinations about what constitutes illegality.

The obligations on a platform to determine what constitutes illegality could obviously become a real problem, particularly in relation to limitations on free expression. As we have already heard, the Public Order Act 1986 criminalises, for example, those who stir up hatred through the use of words, behaviour or written material. That is contentious in the law offline. By “contentious”, I mean that it is a matter of difficulty that requires the full rigour of the criminal justice system, understanding the whole history of established case law. That is all necessary to secure a conviction under that law for offences of this nature.

Now we appear to be saying that, without any of that, social media companies should make the decision, which is a nerve-racking situation to be in. We have already heard the slippery phrase “reasonable grounds to infer”. If that was the basis on which you were sent to prison—if they did not have to prove that you were guilty but they had reasonable grounds to infer that you might be, without any evidence—I would be worried, yet reasonable grounds to infer that the content could be illegal is the basis on which we are asking for those decisions to be made. That is significantly below the ordinary burden of proof required to determine that an illegal act has been committed. Under this definition, I fear that platforms will be forced to over-remove and censor what ultimately will be entirely lawful speech.

Can the Minister consider what competence social media companies have to determine what is lawful? We have heard some of the dilemmas from somebody who was in that position—not to mention the international complications, as was indicated. Will all these big tech companies have to employ lots of ex-policemen and criminal lawyers? How will it work? It seems to me that there is a real lack of qualifications in that sphere—that is not a criticism, because those people decided to work in big tech, not in criminal law, and yet we are asking them to pursue this. That is a concern.

I will also make reference to what I think are the controversies around government Amendments 136A and 136B, to indicate the difficulties of these provisions. They concern illegal activity—such as “assisting unlawful immigration”, illegal entry, human trafficking and similar offences—but I am unsure as to how this would operate. While it is the case that certain means of entry to the UK are illegal, I immediately envisage a situation where a perfectly legitimate political debate—for example, about the small boats controversy—would be taken down, and people advocating a position against the Government’s new Illegal Migration Bill could be accused of supporting illegality. What exactly will be made illegal by those amendments to the Online Safety Bill?

The noble Baroness, Lady Buscombe, made a fascinating speech about an interesting group of amendments. Because of the way the amendments are grouped, I feel that we have moved to a completely different debate, so I will not go into any detail on this subject. Anonymous trolling, Twitter storms and spreading false information are incredibly unpleasant. I am often the recipient of them—at least once a week—so I know personally that you feel frustrated that people tell lies and your reputation is sullied. However, I do not think that these amendments offer the basis on which that activity should be censored, and I will definitely argue against removing anonymity clauses—but that will be in another group. It is a real problem, but I do not think that the solution is contained in these amendments.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- View Speech - Hansard - - - Excerpts

My Lords, my contribution will be less officious than my intervention earlier in this group. In the last couple of years since I returned to the House—as I describe it—having spent time at the Charity Commission, I have noticed a new practice emerging of noble Lords reading out other people’s speeches. Every time I had seen it happen before, I had not said anything, but today I thought, “I can’t sit here and not say anything again”. I apologise for my intervention.

I am grateful to my noble friend Lord Moylan for bringing forward his amendments and for introducing them in the incredibly clear way he did; they cover some very complex and diverse issues. I know that there are other amendments in the group which might be described as similar to his.

There are a couple of things I want to highlight. One interesting thing about the debate on this group is the absence of some of our legal friends—I apologise to my noble friend Lady Buscombe, who is of course a very distinguished lawyer. The point I am making is that we are so often enriched by a lot of legal advice and contributions on some of the more challenging legal issues that we grapple with, but we do not have that today, and this is a very difficult legal issue.

It is worth highlighting again, as has been touched on a little in some of the contributions, the concern, as I understand it, with how the Bill is drafted in relation to illegal content and the potential chilling effect of these clauses on social media platforms. As has already been said, there is a concern that it might lead them to take a safety-first approach in order to avoid breaking the law and incurring the sanctions and fines that come with the Bill, which Ofcom will have the power to apply. That is the point we are concerned with here. It is the way in which this is laid out, and people who are much better equipped than I am have already explained the difference between evidence and reasonable grounds to infer.

What the noble Lord, Lord Allan, hit on in his contribution is also worth taking into account, and that is the role of Ofcom in this situation. One of the things I fear, as we move into an implementation phase and the consequences of the Bill start to impact on the social media firms, is the potential for the regulator to be weaponised in a battle on the cultural issues that people are becoming increasingly exercised about. I do not have an answer to this, but I think it is important to understand the danger of where we might get to in the expectations of the regulator if we create a situation where the social media platforms are acting in a way that means people are looking for recourse, or for somewhere to take forward an argument and a battle that will not be helpful at all.

I am not entirely sure, given my lack of legal expertise—this is why I would have been very grateful for some legal expertise on this group—whether what my noble friend is proposing in his amendments is the solution, but I think we need to be very clear that this is a genuine problem. I am not sure, as things stand in the Bill, that we should be comfortable that it is not going to create problems. We need to find a way to be satisfied that this has been dealt with properly.

Lord Bethell Portrait Lord Bethell (Con)
- View Speech - Hansard - - - Excerpts

It is a great honour to follow my noble friend. I completely agree with her that this is a powerful discussion and there are big problems in this area. I am grateful also to my noble friend Lord Moylan for raising this in the first place. It has been a very productive discussion.

I approach the matter from a slightly different angle. I will not talk about the fringe cases—the ones where there is ambiguity, difficulty of interpretation, or responsibility or regulatory override, all of which are very important issues. The bit I am concerned about is where primary priority content that clearly demonstrates some kind of priority offence is not followed up by the authorities at all.

The noble Lord, Lord Allan, referred to this point, although he did slightly glide over it, as though implying, if I understood him correctly, that this was not an area of concern because, if a crime had clearly been committed, it would be followed up on. My fear and anxiety is that the history of the internet over the last 25 years shows that crimes—overt and clear crimes that are there for us to see—are very often not followed up by the authorities. This is another egregious example of where the digital world is somehow exceptionalised and does not have real-world rules applied to it.

16:30
The noble Baroness, Lady Fox, quite reasonably asked me about Andrew Tate. That matter is sub judice; the noble Lord, Lord Allan, referred to it and I do not want to drag the conversation into dangerous legal territory. However, she makes the good point that we sometimes see, particularly in the online abuse of women, offences that are quite clearly crimes; they are crimes of rape, violent abuse and child abuse. It would not take any of us long to find videos that showed clear examples of crime, but very often they are not followed up with the energy and determination that they could or should be, because things on the internet somehow do not seem to touch the authorities in the way they should do.
His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services—
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - - - Excerpts

I want to clarify one point. I have had a slightly different experience, which is that, for many people—women, at least—whom I have talked to recently, there is an over-enthusiasm and an over-zealous attitude to policing the speech of particular women and, as we have already heard, gender-critical women. It is often done under the banner of hate speech, and there is all sorts of discussion about whether the police are spending too long trawling through social media. By contrast, if you want to get a policeman or policewoman involved in a physical crime in your area, you cannot get them to come out. So I am not entirely convinced. I think policing online speech, at least, is taking up far too much of the authorities’ time, not too little, and distracting them from tackling real social problems and criminal activity.

Lord Bethell Portrait Lord Bethell (Con)
- Hansard - - - Excerpts

I defer to the noble Baroness, Lady Fox, on speech crime. That is not the area of my expertise, and it is not the purpose of my points. My points were to do with the kinds of crime that affect children in particular. His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services is very specific about that point. It says that “unacceptable delays are commonplace” and it gives a very large number of case studies. I will not go through them now because it is Thursday afternoon, but I think noble Lords can probably imagine the kinds of things we are talking about. They include years of delay, cases not taken seriously or overlooked, evidence lost, and so forth. The report found that too often children were put at risk because of this and offenders were allowed to escape justice; it gave 17 recommendations for how the police force should adapt in order to meet this challenge.

So my questions to the Minister are these. When we talk about things such as age verification for hardcore porn, we are quite often told that we do not need to worry about some of this because it is covered by illegal content provisions, and we should just leave it to the police to sort out. His Majesty’s Inspectorate gives clear evidence—this is a recent report from last month—that this is simply not happening in the way it should be. I therefore wondered what, if anything, is in the Bill to try to close down this particular gap. That would be very helpful indeed.

If it is really not for the purposes of this Bill at all—if this is actually to do with other laws and procedures, other departments and the way in which the resources for the police are allocated, as the noble Baroness, Lady Fox, alluded to—what can the Government do outside the boundaries of this legislation to mobilise the police and the prosecution services to address what I might term “digital crimes”: that is, crimes that would be followed up with energy if they occurred in the real world but, because they are in the digital world, are sometimes overlooked or forgotten?

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

My Lords, I would like to mention one issue that I forgot to mention, and I think it would be more efficient to pose the question now to the Minister rather than interject when he is speaking.

On the Government’s Amendments 136A, 136B and 136C on the immigration offences, the point I want to make is that online services can be literal life-savers for people who are engaged in very dangerous journeys, including journeys across the Channel. I hope the Minister will be clear that the intention here is to require platforms to deal only with content, for example, from criminals who are offering trafficking services, and that there is no intention to require platforms somehow to withdraw services from the victims of those traffickers when they are using those services in the interest of saving their own lives or seeking advice that is essential to preserving their own safety.

That would create—as I know he can imagine—real ethical and moral dilemmas, and we should not be giving any signal that we intend to require platforms to withdraw services from people who are in desperate need of help, whatever the circumstances.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, we seem to have done it again—a very long list of amendments in a rather ill-conceived group has generated a very interesting discussion. We are getting quite good at this, exchanging views across the table, across the Committee, even within the Benches—Members who perhaps have not often talked together are sharing ideas and thoughts, and that is a wonderful feeling.

I want to start with an apology. I think I may be the person who got the noble Baroness, Lady Kidron, shopped by the former leader—once a leader, always a leader. What I thought I was being asked was whether the Committee would be interested in hearing the views of the noble Viscount who could not be present, and I was very keen, because when he does speak it is from a point of view that we do not often hear. I did not know that it was a transgression of the rules—but of course it is not, really, because we got round it. Nevertheless, I apologise for anything that might have upset the noble Baroness’s blood pressure—it did not stop her making a very good contribution later.

We have covered so much ground that I do not want to try to summarise it in one piece, because you cannot do that. The problem with the group as it stands is that the right reverend Prelate the Bishop of Derby and I must have some secret connection, because we managed to put down almost the same amendments. They were on issues that then got overtaken by the Minister, who finally got round to—I mean, who put down a nice series of amendments which exactly covered the points we made, so we can lose all those. But this did not stop the right reverend Prelate the Bishop of Guildford making some very good additional points from which I think we all benefited.

I welcome back the noble Baroness, Lady Buscombe, after her illness; she gave us a glimpse of what is to come from her and her colleagues, but I will leave the particular issue that she raised for the Minister to respond to. It raises an issue on which I am not competent, but it is a very important one—we need to get the balance right in responding to what is causing alarm and difficulty outside in relation to what is happening on the internet, and I think we all agree with her that we should not put any barrier in the way of dealing with that.

Indeed, that was the theme of a number of the points that have been raised on the question of what is or can constitute illegal content, and how we judge it. It is useful to hear again from the master about how you do it in practice. I cannot imagine being in a room of French lawyers and experts and retaining my sanity, let alone making decisions that affect the ability of people to carry on, but the noble Lord did it; he is still here and lives to tell the tale—bearded or otherwise.

The later amendments, particularly from the noble Lord, Lord Clement-Jones, are taking us round in a circle towards the process by which Ofcom will exercise the powers that it is going to get in this area. These are probably worth another debate on their own, and maybe it will come up in a different form, because—I think the noble Baroness, Lady Stowell, made this point as well—there is a problem in having an independent regulator that is also the go-to function for getting advice on how others have to make decisions that are theirs to rule on at the end if they go wrong. That is a complicated way of saying that we may be overloading Ofcom if we also expect it to provide a reservoir of advice on how you deal with the issues that the Bill puts firmly on the companies—I agree that this is a problem that we do not really have an answer to.

My amendments were largely overtaken by the Government’s amendments, but the main one I want to talk about is Amendment 272. I am sorry that the noble Baroness, Lady Morgan, is not here, because her expertise is in an area that I want to talk about, which is fraud—cyber fraud in particular—and how that is going to be brought into the Bill. The issue, which I think was raised by Which? and which a number of other people have also written to us about, is that the Bill, in Clauses 170 and 171, tries to establish how a platform should identify illegal content in relation to fraud—but it is quite prescriptive. In particular, it goes into some detail which I will leave for the Minister to respond to, but uniquely it sets out a specific way of gathering information to determine whether content is illegal in this area, although it may have applicability in other areas.

One of the points that have to be taken into account is whether the platform is using human moderators, automated systems or a combination of the two. I am not quite sure why that is there in the Bill; that is really the basis for the tabling of our amendments. Clearly, one would hope that the end result is whether or not illegality has taken place, not how that information has been gathered. If one must make concessions to the process of law because a judgment is made that, because it is automated, it is in some way not as valid as if it had been done by a human moderator, there seems to be a whole world there that we should not be going into. I certainly hope that that is not going to be the case if we are talking about illegality concerning children or other vulnerable people, but that is how the Bill reads at present; I wonder whether the Minister can comment on that.

There is a risk of consumers being harmed here. The figures on fraud in the United Kingdom are extraordinary; it is remarkable that it is not a top priority for everybody, not least the Government. Consumers are being scammed at a rate of something like £7.5 billion per year. A number of awful types of scamming have emerged only because of the internet and social media. They create huge problems of anxiety and emotional distress, with lots of medical care and other things tied in if you want to work out the total bill. So we have a real problem here that we need to settle. It is great that it is in the Bill, but it would be a pity if the movement towards trying to resolve it were in any way impeded by imperfect instructions in the Bill. I wonder whether the Minister would be prepared to respond to that; I would be happy to discuss it with him later, if that is possible.

As a whole, this is an interesting question as we move away from what a crime is towards how people judge how to deal with what they think is a crime but may not be. The noble Lord, Lord Allan, commented on how to do it in practice but one hopes that any initial problems will be overcome as we move forward and people become more experienced with this.

When the Joint Committee considered this issue, we spent a long time talking about why we were concerned to have certainty on the legal prescription in the Bill; that is why we were very much against the idea of “legal but harmful”, because it seemed too subjective and too open to difficulties. Out of that came another thought, which answers the point made by the noble Baroness, Lady Stowell: so much of this is about fine judgments on things that are set in stone and that you can work to, but which you then have to interpret.

There is a role for Parliament here, I think; we will come on to this in later amendments but, if there is a debate to be had on this, let us not forget the points that have been made here today. If we are going to think again about Ofcom’s activity in practice, that is the sort of thing that either a Joint Committee or Select Committees of the two Houses could easily take on board as an issue that needs to be reflected on, with advice given to Parliament about how it might be taken forward. This might be the answer in the medium term.

In the short term, let us work to the Bill and make sure that it works. Let us learn from the experience but let us then take time out to reflect on it; that would be my recommendation but, obviously, that will be subject to the situation after we finish the Bill. I look forward to hearing the Minister’s response.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- View Speech - Hansard - - - Excerpts

My Lords, as well as throwing up some interesting questions of law, this debate has provoked some interesting tongue-twisters. The noble Lord, Lord Allan of Hallam, offered a prize to the first person to pronounce the Netzwerkdurchsetzungsgesetz; I shall claim my prize in our debate on a later group when inviting him to withdraw his amendment.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

Yes, that would be welcome.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

Can I suggest one of mine?

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I thank the noble Lord.

I was pleased to hear about Wicipedia Cymraeg—there being no “k” in Welsh. As the noble Lord, Lord Stevenson, said, there has been a very good conversational discussion in this debate, as befits Committee and a self-regulating House. My noble friend Lady Stowell is right to point out matters of procedure, although we were grateful to know why the noble Viscount, Lord Colville, supports the amendments in question.

16:45
My noble friend Lord Moylan’s first group within a group—Amendments 17 and 18—alters the duties in Clause 9 of the Bill. These amendments would weaken the illegal content duties by removing any obligation on services to take upstream measures to remove illegal content, including child sexual abuse material. They would therefore seriously undermine the Bill’s focus on proactive risk management. Similarly, Amendments 272 to 283 seek to alter how services should judge what is illegal. I understand that noble Lords are concerned, rightly, about the over-removal of content.
The amendments tabled by the noble Lord, Lord Clement-Jones, would require providers to have sufficient evidence that content is illegal before taking action against it, replacing the current test of “reasonable grounds to infer”. Sufficient evidence is a subjective measure. We have discussed the difficulties for those who must make these decisions and we think that this formulation would set an unclear threshold for providers to determine how they should judge illegality, which could result in the under-removal of illegal content, putting users at risk, or the over-removal of it, with adverse consequences for freedom of expression.
The amendments tabled by my noble friend Lord Moylan would narrow the test to require the removal only of content which, based on all reasonably available contextual evidence, is manifestly illegal, and we think that that threshold is too high. Context and analysis can give a provider good reasons to infer that content is illegal even though the illegality is not immediately obvious. This is the case with, for example, some terrorist content which is illegal only if shared with terrorist purposes in mind, and intimate image abuse, where additional information or context is needed to know whether content has been posted against the subject’s wishes.
Amendment 276 would remove the detail in Clause 170 that specifies the point at which providers must treat content as illegal or fraudulent. That would enable providers to interpret their safety duties in broader ways. Rather than giving providers greater discretion, it would give Ofcom less certainty about whether it could successfully take enforcement action. I take the point raised by noble Lords about the challenges of how platforms will identify illegal content, and I agree with my noble friend Lady Stowell that the contributions of noble and learned Lords would be helpful in these debates as well. However, Clause 170 sets out how companies should determine whether content is illegal or an advertisement is fraudulent. I will say a little more about the context behind that, as the noble Lord, Lord Allan, may have a question.
The Bill recognises that it will often be difficult for providers to make judgments about content without considering the context. Clause 170 therefore clarifies that providers must ascertain whether, on the basis of all reasonably available information, there are reasonable grounds to infer that all the relevant elements of the offence—including the mental elements—are present and that no defence is available. The amount of information that would be reasonably available to a particular service provider will depend on the size and capacity of the provider, among other factors.
Companies will need to ensure that they have effective systems to enable them to check the broader context relating to content when deciding whether or not to remove it. This will provide greater certainty about the standard to be applied by providers when assessing content, including judgments about whether or not content is illegal. We think that protects against over-removal by making it clear that platforms are not required to remove content merely on the suspicion of it being illegal. Beyond that, the framework also contains provisions about how companies’ systems and processes should approach questions of mental states and defences when considering whether or not content is an offence in the scope of the Bill.
Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

I am struggling a little to understand why the Minister thinks that sufficient evidence is subjective, and therefore, I assume, reasonable grounds to infer is objective. Certainly, in my lexicon, evidence is more objective than inference, which is more subjective. I was reacting to that word. I am not sure that he has fully made the case as to why his wording is better.

Lord Moylan Portrait Lord Moylan (Con)
- View Speech - Hansard - - - Excerpts

Or indeed any evidence.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I take the noble Lord’s point and my noble friend’s further contribution. I will see whether I can give a clearer and more succinct description in writing to flesh that out, but that it is the reason that we have alighted on the words that we have.

The noble Lord, Lord Allan, also asked about jurisdiction. If an offence has been committed in the UK and viewed by a UK user, it can be treated as illegal content. That is set out in Clause 53(11), which says:

“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom”.


I hope that that bit, at least, is clearly set out to the noble Lord’s satisfaction. It looks like it may not be.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - - - Excerpts

Again, I think that that is clear. I understood from the Bill that, if an American says something that would be illegal were they to be in the United Kingdom, we would still want to exclude that content. But that still leaves it open, and I just ask the question again, for confirmation. If all of the activities are outside the United Kingdom—Americans talking to each other, as it were—and a British person objects, at what point would the platform be required to restrict the content of the Americans talking to each other? Is it pre-emptively or only as and when somebody in the United Kingdom objects to it? We should flesh out that kind of practical detail before this becomes law.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

If it has been committed in the UK and is viewed by a UK user, it can be treated as illegal. I will follow up on the noble Lord’s further points ahead of the next stage.

Amendment 272 explicitly provides that relevant information that is reasonably available to a provider includes information submitted by users in complaints. Providers will already need to do this when making judgments about content, as it will be both relevant and reasonably available.

My noble friend Lord Moylan returned to the question that arose on day 2 in Committee, querying the distinction between “protect” and “prevent”, and suggesting that a duty to prevent would or could lead to the excessive removal of content. To be clear, the duty requires platforms to put in place proportionate systems and processes designed to prevent users encountering content. I draw my noble friend’s attention to the focus on systems and processes in that. This requires platforms to design their services to achieve the outcome of preventing users encountering such content. That could include upstream design measures, as well as content identification measures, once content appears on a service. By contrast, a duty to protect is a less stringent duty and would undermine the proactive nature of the illegal content duties for priority offences.

Lord Moylan (Con)

Before he moves on, is my noble friend going to give any advice to, for example, Welsh Wikipedia, as to how it will be able to continue, or are the concerns about smaller sites simply being brushed aside, as my noble friend explicates what the Bill already says?

Lord Parkinson of Whitley Bay (Con)

I will deal with all the points in the speech. If I have not done so by the end, and if my noble friend wants to intervene again, I will be more than happy to hear further questions, either to answer now or to write to him about.

Amendments 128 to 133 and 143 to 153, in the names of the right reverend Prelate the Bishop of Derby and the noble Lord, Lord Stevenson of Balmacara, seek to ensure that priority offences relating to modern slavery and human trafficking, where they victimise children, are included in Schedule 6. These amendments also seek to require technology companies to report content which relates to modern slavery and the trafficking of children—including the criminal exploitation of children—irrespective of whether it is sexual exploitation or not. As noble Lords know, the strongest provisions in the Bill relate to children’s safety, and particularly to child sexual exploitation and abuse content. These offences are captured in Schedule 6. The Bill includes a power for Ofcom to issue notices to companies requiring them to use accredited technology or to develop new technology to identify, remove and prevent users encountering such illegal content, whether communicated publicly or privately.

These amendments would give Ofcom the ability to issue such notices for modern slavery content which affects children, even when there is no child sexual exploitation or abuse involved. That would not be appropriate for a number of reasons. The power to tackle illegal content on private communications has been restricted to the identification of content relating to child sexual exploitation and abuse because of the particular risk to children posed by content which is communicated privately. Private spaces online are commonly used by networks of criminals to share illegal images—as we have heard—videos, and tips on the commission of these abhorrent offences. This is highly unlikely to be reported by other offenders, so it will go undetected if companies do not put in place measures to identify it. Earlier in Committee, the noble Lord, Lord Allan, suggested that those who receive it should report it, but of course, in a criminal context, a criminal recipient would not do that.

Extending this power to cover the identification of modern slavery in content which is communicated privately would be challenging to justify and could represent a disproportionate intrusion into someone’s privacy. Furthermore, modern slavery is usually identified through patterns of behaviour or by individual reporting, rather than through content alone. This reduces the impact that any proactive technology required under this power would have in tackling such content. Schedule 6 already sets out a comprehensive list of offences relating to child sexual exploitation and abuse which companies must tackle. If these offences are linked to modern slavery—for example, if a child victim of these offences has been trafficked—companies must take action. This includes reporting content which amounts to an offence under Schedule 6 to the National Crime Agency or another reporting body outside of the UK.

My noble friend Lord Moylan’s Amendment 135 seeks to remove the offence in Section 5 of the Public Order Act 1986 from the list of priority offences. His amendment would mean that platforms were not required to take proactive measures to reduce the risk of content which is threatening or abusive, and intended to cause a user harassment, alarm or distress, appearing on their service. Instead, they would be obliged to respond only once they are made aware of the content, which would significantly reduce the impact of the Bill’s framework for tackling such threatening and abusive content. Given the severity of the harm which can be caused by that sort of content, it is right that companies tackle it. Ofcom will have to include the Public Order Act in its guidance about illegal content, as provided for in Clause 171.

Government Amendments 136A to 136C seek to strengthen the illegal content duties by adding further priority offences to Schedule 7. Amendments 136A and 136B will add human trafficking and illegal entry offences to the list of priority offences in the Bill. Crucially, this will mean that platforms will need to take proactive action against content which encourages or assists others to make dangerous, illegal crossings of the English Channel, as well as those who use social media to arrange or facilitate the travel of another person with a view to their exploitation.

The noble Lord, Lord Allan, asked whether these amendments would affect the victims of trafficking themselves. This is not about going after the victims. Amendment 136B addresses only content which seeks to help or encourage the commission of an existing immigration offence; it will have no impact on humanitarian communications. Indeed, to flesh out a bit more detail, Section 2 of the Modern Slavery Act makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation. Facilitating a victim’s travel includes recruiting them. This offence largely appears online in the form of advertisements to recruit people into being exploited. Some of the steps that platforms could put in place include setting up trusted flagger programmes, signposting users to support and advice, and blocking known bad actors. Again, I point to some of the work which is already being done by social media companies to help tackle both illegal channel crossings and human trafficking.

17:00
Government Amendment 136C will add the offence of foreign interference to the list of priority offences in the Bill. As your Lordships will know, the Government previously made an amendment via the National Security Bill to include this offence in this Bill. Because of the relative pace at which these two Bills are now passing through Parliament, we are now doing it directly in the Online Safety Bill.
My noble friend Lady Buscombe’s Amendment 137 seeks to list the false and threatening communication offences in Schedule 7. Listing the communication offences as priority offences would require platforms to identify and determine the illegality of such content proactively. I appreciate the reasons she set out for raising this issue, but as these offences rely heavily on a user’s mental state, it would be challenging for services to identify this content without significant additional context. Let me reassure her, however, that platforms will still need to have systems and processes in place to remove this content quickly when it is reported to them, as with all other illegal content which is not in Schedule 7.
My noble friend Lord Bethell anticipated later debates on age verification and pornography. If he permits, I will come back on his points then. I have noted his question for that discussion as well as the question from the noble Lord, Lord Stevenson, on financial scams and fraud, which we will have the chance to discuss in full. I am not sure if my noble friend Lord Moylan wants to ask a further question at this juncture or to accept a reassurance that I will consult the Official Report and write on any further points he raised which I have not dealt with.
Lord Moylan (Con)

My Lords, it is genuinely difficult to summarise such a wide-ranging debate, which was of a very high standard. Only one genuinely bright idea has emerged from the whole thing: as we go through Committee, each group of amendments should be introduced by the noble Lord, Lord Allan of Hallam, because it is only after I have heard his contribution on each occasion that I have begun to understand the full complexity of what I have been saying. I suspect I am not alone in that and that we could all benefit from hearing the noble Lord before getting to our feet. That is not meant to sound the slightest bit arch; it is absolutely genuine.

The debate expressed a very wide range of concerns. Concerns about gang grooming and recruiting were raised on behalf of the right reverend Prelate the Bishop of Derby, and my noble friend Lady Buscombe expressed concerns about the trolling of country businesses. However, I think it is fair to say that most speakers focused on the following issues. The first was the definition of legality, which was so well explicated by the noble Lord, Lord Allan of Hallam. The second was the judgment bar that providers have to pass to establish whether something should be taken down. The third was the legislative mandating of private foreign companies to censor free speech rights that are so hard-won here in this country. These are the things that mainly concern us.

I was delighted that I found myself agreeing so much with what the noble Baroness, Lady Kidron, said, even though she was speaking in another voice or on behalf of another person. If her own sentiments coincide with the sentiments of the noble Viscount—

Baroness Kidron (CB)

I am sorry to intrude, but I must say now on the record that I was speaking on my own behalf. The complication of measuring and those particular things are terribly important to establish, so I am once again happy to agree with the noble Lord.

Lord Moylan (Con)

I am delighted to hear the noble Baroness say that, and it shows that the pool of common ground we share is widening every time we get to our feet. However, the pool is not particularly widening, I am afraid to say—at least in respect of myself; other noble Lords may have been greatly reassured—as regards my noble friend the Minister, who has not in any sense addressed the issues about free speech that I and many other noble Lords raised. On some issues we in the Committee are finding a consensus that is drifting away from the Minister. We probably need to put our heads together more closely on some of these issues with the passage of time in Committee.

My noble friend also did not say anything that satisfied me in respect of the practical operation of these obligations for smaller sites. He speaks smoothly and persuasively of risk-based proactive approaches without saying that, for a large number of sites, this legislation will mean a complete re-engineering of their business model. For example, take Wikipedia operating in a minority language, such as Welsh Wikipedia, which is the largest Welsh-language website in the world. If its model is to monitor what is put out by the community and correct it as it goes along, rather than to have a model designed in advance to prevent things being put there in the first place, then it is very likely to close down. If that is one of the consequences of this Bill, the Government will soon hear about it.

Finally, although I remain concerned about public order offences, I have to say to the Minister that if he is so concerned about the dissemination of alarm among the population under the provisions of the Public Order Act, what does he think that His Majesty’s Government were doing on Sunday at 3 pm? I beg leave to withdraw the amendment.

Amendment 17 withdrawn.
Amendment 18 not moved.
Amendment 18A
Moved by
18A: Clause 9, page 8, line 23, at end insert—
“(8A) A duty to summarise in the terms of service the findings of the most recent illegal content risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to individuals).”
Member’s explanatory statement
This amendment requires providers of Category 1 services to summarise (in their terms of service) the findings of their latest risk assessment regarding illegal content and activity. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.
Amendment 18A agreed.
Clause 9, as amended, agreed.
Clause 10: Children’s risk assessment duties
Amendment 19 not moved.
The Deputy Chairman of Committees (Lord Beith) (LD)

If Amendment 20 is agreed, I cannot call Amendment 21 by reason of pre-emption.

Amendment 20

Moved by
20: Clause 10, page 9, line 11, leave out paragraphs (a) to (h) and insert—
“(a) the level of risk that children who are users of the service encounter the harms as outlined in Schedule (Online harms to children) by means of the service;
(b) any of the level of risks to children encountered singularly or in combination, having regard to—
(i) the design of functionalities, algorithms and other features that present or increase risk of harm, such as low-privacy profile settings by default;
(ii) the business model, revenue model, governance, terms of service and other systems and processes or mitigation measures that may reduce or increase the risk of harm;
(iii) risks which can build up over time;
(iv) the ways in which level of risks can change when experienced in combination with others;
(v) the level of risk of harm to children in different age groups;
(vi) the level of risk of harm to children with certain characteristics or who are members of certain groups; and
(vii) the different ways in which the service is used including but not limited to via virtual and augmented reality technologies, and the impact of such use on the level of risk of harm that might be suffered by children;
(c) whether the service has shown regard to the rights of children as set out in the United Nations Convention on the Rights of the Child (see general comment 25 on children’s rights in relation to the digital environment).”
Member’s explanatory statement
This amendment would require providers to look at and assess risks on their platform in the round and in line with the 4 Cs of online risks to children (content, contact, conduct and contractual/commercial risks). Although these risks will not be presented on every service, this amendment requires providers to reflect on these risks, so they are not forgotten and can be built into future development of the service.
Baroness Kidron (CB)

My Lords, this amendment and Amendments 74, 93 and 123 are part of a larger group that have been submitted as a package loosely referred to as the AV and harms package. They have been the subject of much private debate with the Government, for which we are grateful, and among parliamentarians, and have featured prominently in the media. The amendments are in my name and those of the noble Lord, Lord Bethell, the right reverend Prelate the Bishop of Oxford and the noble Lord, Lord Stevenson, but enjoy the support of a vast array of Members of both Houses. I thank all those who have voiced their support.

The full package of amendments defines and sets out the rules of the road for age assurance, including the timing of its introduction, and the definition of terms such as age verification and age assurance. They introduce the concept of measuring the efficacy of systems with one eye on the future so that we as parliamentarians can indicate where and when we feel that proportionality is appropriate and where it is simply not—for example, in relation to pornography. In parallel, we have developed a schedule of harms, which garners rather fewer column inches but is equally important in establishing Parliament’s intention. It is that schedule of harms that is up for debate today.

Before I lay out the amendment, I thank the 26 children’s charities which have so firmly got behind this package and acknowledge, in particular, Barnardo’s, CEASE and 5Rights, of which I am chair, which have worked tirelessly to ensure that the full expertise of children’s charities has been embedded in these amendments. I also pay tribute to the noble Baroness, Lady Benjamin, who in this area of policy has shown us all the way.

The key amendment in this group is Amendment 93, which would place a schedule of harms to children in the Bill. There are several reasons for doing so, the primary one being that by putting them in the Bill we are stating the intention of Parliament, which gives clarity to companies and underlines the authority of Ofcom to act on these matters. Amendments 20, 74 and 123 ensure that the schedule is mirrored in risk assessments and task Ofcom with updating its guidance every six months to capture new and emerging harms, and as such are self-evident.

The proposed harms schedule is centred around the four Cs, a widely used and understood taxonomy of harm used in legislation and regulation around the globe. Importantly, rather than articulate individual harms that may change over time, it sets its sight on categories of harm: content, contact, conduct and contract, which is sometimes referred to as commercial harm. It also accounts for cumulative harms, where two or more risk factors create a harm that is greater than any single harm or is uniquely created by the combination. The Government’s argument against the four Cs is that they are not future-proof, which I find curious since the very structure of the four Cs is to introduce broad categories of harm to which harms can be added, particularly emerging harms. By contrast, the Government are adding an ever-growing list of individual harms.

I wish to make three points in favour of our package of amendments relating first to language, secondly to the nature of the digital world, and finally to clarity of purpose. It is a great weakness of the Bill that it consistently introduces new concepts and language—for example, the terms “primary priority content”, “priority content” and “non-designated content”. These are not terms used in other similar Bills across the globe, they are not evident in current UK law and they do not correlate with established regimes, such as equalities legislation or children’s rights under the convention, more of which in group 7.

The question of language is non-trivial. It is the central concern of those who fight CSAE around the world, who frequently find that enforcement against perpetrators or takedown is blocked by legal systems that define child sexual abuse material differently—not differently in some theoretical sense but because the same image can be categorised differently in two countries and then be a barrier to enforcement across jurisdictions. Leadership from WeProtect, the enforcement community and representatives that I recently met from Africa, South America and Asia have all made this point. It undermines the concept of UK leadership in child protection that we are wilfully and deliberately rejecting accepted language which is embedded in treaties, international agreements and multilateral organisations to start again with our own, very likely with the same confused outcome.

Secondly, I am concerned that while both the Bill and the digital world are predicated on system design, the harms are all articulated as content with insufficient emphasis on systems harms, such as careless recommendations, spreading engagement and the sector-wide focus on maximising engagement, which are the very things that create the toxic and dangerous environment for children. I know, because we have discussed it, that the Minister will say that this is all in the risk assessment, but the risk assessment asks regulated companies to assess how a number of features contribute to harm, mostly expressed as content harm.

What goes through my mind is the spectre of Meta’s legal team, which I watched for several days during Molly Russell’s inquest; they stood in a court of law and insisted that hundreds, in fact thousands, of images of cut bodies and depressive messages did not constitute harm. Rather, they regarded them as cries for help or below the bar of harm as they interpreted it. Similarly, there was material that featured videos of people jumping off buildings—some of them sped-up versions of movie clips edited to suggest that jumping was freedom—and I can imagine a similar argument that says that kind of material cannot be considered harmful, because in another context it is completely legitimate. Yet this material was sent to Molly at scale.

17:15
It is not good enough to characterise harms simply by establishing what is or is not harmful content. The previous debate really underlined how long it takes, and how complicated it is, to determine what is harmful. But we must make it utterly clear that the drip feed of nudges, enticements and recommendations and the creation of a toxic environment, overwhelming a child of 14 with more than 1,400 messages, whether they meet that bar of harmful content or not, is in itself a harm. A jukebox of content harms is not future-proof, and it fails to name the risks of the system. It is to misunderstand where the power of digital design actually lies.
Finally, there is the question of simplicity and clarity. As we discussed on the first day of Committee, business wants clarity, campaigners want clarity, parents want clarity, and Ofcom could do with some clarity. If not the four Cs, my challenge to the Government is to deliver a schedule that has the clarity and simplicity of the amendments in front of us, in which harm is defined by category not by individual content measurements, so that it is flexible now and into the future, and foregrounds the specific role of the system design not only as an accomplice to the named harm but as a harm itself. I beg to move.
Baroness Ritchie of Downpatrick (Lab)

My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron. I have listened intently today, and there is no doubt that this Bill not only presents many challenges but throws up the complexity of the whole situation. I think it was the noble Lord, Lord Kamall, in an earlier group who raised the issues of security, safety and freedom. I would add the issue of rights, because we are trying to balance all these issues and characterise them in statute, vis-à-vis the Bill.

On Tuesday, we spoke about one specific harm—pornography—on the group of amendments that I had brought forward. But I made clear at that time that I believe this is not the only harm, and I fully support the principles of the amendments from the noble Baroness, Lady Kidron. I would obviously like to get some clarity from her on the amendments, particularly as to how they relate to other clauses in the Bill.

The noble Baroness has been the pioneer in this field, and her expertise is well recognised across the House. I believe that these amendments really take us to the heart of the Bill and what we are trying to achieve—namely, to identify online harms to children, counteract them and provide a level of safety to young people.

As the noble Lord, Lord Clement-Jones, said on Tuesday,

“there is absolutely no doubt that across the Committee we all have the same intent; how we get there is the issue between us”.—[Official Report, 25/4/23; col. 1196.]

There is actually not that much between us. I fully agree with the principle of putting some of the known harms to children in the Bill. If we know the harms, there is little point in waiting for them to be defined in secondary legislation by Clause 54.

It is clear to me that there are harms to children that we know about, and those harms will not change. It would be best to name those harms clearly in the Bill when it leaves this House. That would allow content providers, search engines and websites in scope of the Bill to prepare to make any changes they need to keep children safe. Perhaps the Minister could comment on that aspect. We also know that parents will expect some harms to be in the Bill. The noble Baroness, Lady Kidron, laid out what they are, and I agree with her analysis. These issues are known and we should not wait for them to be named.

While known harms should be placed into the Bill, I know, understand and appreciate that the Government are concerned about future-proofing. However, I am of the view that a short list of key topics will not undermine that principle. Indeed, the Joint Committee’s report on the draft Bill stated,

“we recommend that key, known risks of harm to children are set out on the face of the Bill”.

In its report on the Bill, the DCMS Select Committee in the other place agreed, saying

“that age-inappropriate or otherwise inherently harmful content and activity (like pornography, violent material, gambling and content that promotes or is instructive in eating disorders, self-harm and suicide) should appear on the face of the Bill”.

Has there been any further progress in discussions on those issues?

At the beginning of the year, the Children’s Commissioner urged Parliamentarians

“to define pornography as a harm to children on the face of the … Bill, such that the regulator, Ofcom, may implement regulation of platforms hosting adult content as soon as possible following the passage of the Bill”.

I fully agree with the Children’s Commissioner. While the ways in which pornographic content is delivered will change over time, the fact that pornography is harmful to children will not change. With the speed of technology—something that the noble Lord, Lord Allan of Hallam, knows a lot more about than the rest of us, having worked in this field—it will undoubtedly change, and we will be presented with new types of challenge.

I therefore urge the Government to support the principle that the key risks are in the Bill, and I thank the noble Baroness, Lady Kidron, for raising this important principle. However, I hope she will indulge me as I seek to probe some of the detail of her amendments and their interactions with the architecture of other parts of the Bill. As I said when speaking to Clause 49 on Tuesday, the devil is obviously in the detail.

First, Clause 54 defines what constitutes

“Content that is harmful to children”,

and Clause 205 defines harm, while Amendment 93 proposes an additional new list of harms. As I have already said, I fully support the principle of harms being in the Bill, but I raise a question for the noble Baroness. How does she see these three definitions working together? That might refer back to a preliminary discussion that we had in the tearoom earlier.

These definitions of harms are in addition to the content to be defined as primary priority content and priority content. Duties in Clauses 11 and 25 continue to refer to these two types of content for Part 3 services, but Amendments 20 and 74 would remove the need for risk assessments in Clauses 10 and 24 to address these two types of content. It seems that the amendments could create a tension in the Bill, and I am interested to ascertain how the noble Baroness, Lady Kidron, foresees that tension operating. Maybe she could give us some detail in her wind-up about that issue. An explanation of that point may bring some clarity to understanding how the new schedule that the noble Baroness proposes will work alongside the primary priority content and the priority content lists. Will the schedule complement primary priority content, or will it be an alternative?

Secondly, as I said, some harms are known but there are harms that are as yet unknown. Will the noble Baroness, Lady Kidron, consider a function to add to the list of content in her Amendment 93, in advance of us coming back on Report? There is no doubt that the online space is rapidly changing, as this debate has highlighted. I can foresee a time when other examples of harm should be added to the Bill. I accept that the drafting is clear that the list is not exclusive, but it is intended to be a significant guide to what matters to the public and Parliament. I also accept that Ofcom can provide guidance on other content under Amendment 123, but, without a regulatory power added to Amendment 93, it feels that we are perhaps missing a belt-and-braces approach to online harms to children. After all, our principal purpose here is to protect children from online harm.

I commend the noble Baroness, Lady Kidron, on putting these important amendments before the Committee, and I fully support the principle of what she seeks to achieve. But I hope that, on further reflection, she will look at the points I have suggested. Perhaps she might suggest other ideas in her wind-up, and we could have further discussions in advance of Report. I also look forward to the Minister’s comments on these issues.

The Lord Bishop of Oxford

My Lords, I support Amendments 20, 93 and 123, in my name and those of the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Stevenson. I also support Amendment 74 in the name of the noble Baroness, Lady Kidron. I pay tribute to the courage of all noble Lords and their teams, and of the Minister and the Bill team, for their work on this part of the Bill. This work involves the courage to dare to look at some very difficult material that, sadly, shapes the everyday life of too many young people. This group of amendments is part of a package of measures to strengthen the protections for children in the Bill by introducing a new schedule of harms to children and plugging a chronological gap between Part 3 and Part 5 services, on when protection from pornography comes into effect.

Every so often in these debates, we have been reminded of the connection with real lives and people. Yesterday evening, I spent some time speaking on the telephone with Amanda and Stuart Stephens, the mum and dad of Olly Stephens, who lived in Reading, which is part of the diocese of Oxford. Noble Lords will remember that Olly was tragically murdered, aged 13, in a park near his home, by teenagers of a similar age. Social media played a significant part in the investigation and in the lives of Olly and his friends—specifically, social media posts normalising knife crime and violence, with such a deeply tragic outcome.

17:30
Last year in June, “Panorama” dared to look into this world. The programme revealed the depth and extent of the normalisation of knives and knife crime in posts offered to young people. I was struck by the comments of Frances Haugen, filmed when she met Stuart and Amanda. She said that each of us sees social media through a pinhole: a tiny snapshot of the total content. We have no idea how much darkness and evil are shaping children and young people, destroying their sense of proportion and deeply affecting offline behaviour. The only group that has the whole picture, of course, is the companies themselves.
The noble Baroness, Lady Kidron, and others have outlined the remarkable degree of support for this raft of amendments from charities working to protect children. We should listen. These amendments will ensure a much wider definition of “harm” and will again future-proof the Bill in terms of technology which is even now coming over the horizon.
The Center for Countering Digital Hate speaks about an arms race to devise ever more effective ways of keeping users’ attention, even if it means putting them at risk. Its researchers set up new accounts in the United States, United Kingdom, Canada and Australia at the minimum age TikTok allows: 13 years old. Those accounts paused briefly on videos about body image and mental health and liked them. What the researchers found was deeply disturbing. Within 2.6 minutes, TikTok recommended suicide content. Within 8 minutes, TikTok served content relating to eating disorders. Every 39 seconds, TikTok recommended videos about body image and mental health to teens. CCDH researchers found a community for eating disorder content on the platform amassing 13.2 billion views across 56 hashtags, often designed to evade moderation.
As the noble Baroness, Lady Kidron, said, this fourfold classification of harms to children is being adapted elsewhere in the world, including the European Union. The schedule in the amendment gives clear but non-exhaustive examples to guide service providers on the meaning of each of the four Cs. It is vital to have more comprehensive agreed definitions of harm in the Bill.
I will reflect for a moment on what each of the four Cs means. Content harms are the most familiar. At the moment, children who go online are likely to encounter age-inappropriate content, including violent, gory and graphic communication, hate speech, terrorism, online prostitution, drugs, eating disorders and self-harm. Research also shows that exposure to different types of harmful content is interrelated: so, if a child reports seeing one type of disturbing content, it is likely that they have seen others as well.
Secondly, contact harms encourage harmful actions in the non-virtual world. A 10 year-old girl was left with burns after spraying an aerosol deodorant with the nozzle right up against her skin to create a freezing sensation. Jane Platt’s daughter Sarah, aged 15, was rushed to hospital in February 2020 after doing the “skull-breaker challenge”, which involves two people kicking the legs from under a third, making them fall over. These suggestions could never be offered in young people’s magazines or broadcast media.
Thirdly, there are conduct harms. In a global survey, 54% of young people—57% of girls and 48% of boys—reported having experienced online sexual harms before they were 18 years old, including within interaction with adults and being asked something sexually explicit or being sent sexually explicit content.
Finally, there are commercial harms. Over half of the games on Google Play now include loot boxes and more than 93% of games that feature loot boxes are marked suitable for children aged 12 years-plus.
As the noble Baroness, Lady Kidron, and others have argued, these harms are often cumulative and interrelated. The social media companies are the only ones not looking through a keyhole but monitoring social media in the round and able to assess what is happening, yet evidence suggests that they will not do so until compelled by legislation. These amendments are a vital step forward in fulfilling the Bill’s purpose of providing additional protection from harm for children. I urge the Government to adopt them.
Baroness Fox of Buckley (Non-Afl)

My Lords, I really appreciated the contribution from the noble Baroness, Lady Ritchie of Downpatrick, because she asked a lot of questions about this group of amendments. Although I might be motivated by different reasons, I found it difficult to fully understand the impact of the amendments, so I too want to ask a set of questions.

Harm is defined in the Bill as “physical or psychological harm”, and there is no further explanation. I can understand the frustration with that and the attempts therefore to use what are described as the

“widely understood and used 4 Cs of online risk to children”.

They are not widely understood by me, and I have ploughed my way through it. I might well have misunderstood lots in it, but I want to look at and perhaps challenge some of the contents.

I was glad that Amendment 20 recognises the level of risk of harm to different age groups. That concerns me all the time when we talk about children and young people and then end up treating four year-olds, 14 year-olds and 18 year-olds in the same way. I am glad that that is there, and I hope that we will look at it again in future.

I want to concentrate on Amendment 93 and reflect and comment more generally on the problem of a definition, or a lack of definition, of harm in the Bill. For the last several years that we have been considering bringing this Bill to this House and to Parliament, I have been worried about the definition of psychological harm. That is largely because this category has become ever more expansive and quite subjective in our therapeutic age. It is a matter of some discussion and quite detailed work by psychologists and professionals, who worry that there is an ever-expanding concept of what is considered harmful and of what psychological harm really means.

As an illustration, I was invited recently to speak to a group of sixth-formers and was discussing things such as trigger warnings and so on. They said, “Well, you know, you’ve got to understand what it’s like”—they were 16 year-olds. “When we encounter certain material, it makes us have PTSD”. I was thinking, “No, it doesn’t really, does it?” Post-traumatic stress disorder is something that you might well gain if you have been in the middle of a war zone. The whole concept of triggering came from psychological and medical insights from the First World War, which you can understand. If you hear a car backfiring, you think it is somebody shooting at you. But the idea here is that we should have trigger warnings on great works of literature and that if we do not it will lead to PTSD.

I am not being glib, because an expanded, elastic and pathologised view of harm is being used quite cavalierly and casually in relation to young people and protecting them, often by the young people themselves. It is routinely used to close down speech as part of the cancel culture wars, which, as noble Lords know, I am interested in. Is there not a danger that this concept of harm is not as obvious as we think, and that the psychological harm issue makes it even more complicated?

The other thing is that Amendment 93 says:

“The harms in this Schedule are a non-exhaustive list of categories and other categories may be relevant”.


As with the discussion on whose judgment decides the threshold for removing illegal material, I think that judging what is harmful is even more tricky for the young in relation to psychological harm. I was reminded of that when the noble Baroness, Lady Kidron, complained that what she considered to be obviously and self-evidently harmful, Meta did not. I wondered whether that is just the case with Meta, or whether views will differ when it comes to—

Baroness Kidron (CB)

The report found—I will not give a direct quotation—that social media contributed to the death of Molly Russell, so it was the court’s judgment, not mine, that Meta’s position was indefensible.

Baroness Fox of Buckley (Non-Afl)

I completely understand that; I was making the point that there will be disagreements in judgments. In that instance, it was resolved by a court, but we are talking about a situation where I am not sure how the judgment is made.

In these amendments, there are lists of particular harms—a variety are named, including self-harm—and I wanted to provide some counterexamples of what I consider to be harms. I have been inundated by algorithmic adverts for “Naked Education” on Channel 4, maybe because of the algorithms I am on. I think that the programme is irresponsible; I say that having watched it, rather than just having read a headline. Channel 4 presents this programme, which features naked adults and children, as educational by saying that it introduces children to the naked body. I think it is harmful for children and that it should not be on the television, but it is advertised on social media—I have seen quite a lot of it.

The greatest example of self-harm we encounter at present is when gender dysphoric teenagers—as well as some younger than teenagers; they are predominately young women—are affirmed by adults, as a kind of social contagion, into taking body-changing and body-damaging hormones and performing self-mutilation, whether by breast binding or double mastectomies, which is advertised and praised by adults. That is incredibly harmful for young people, and it is reflected online at lot, because much of this is discussed, advertised or promoted online.

This is related to the earlier contributions, because I am asking: should those be added to the list of obvious harms? Although not many noble Lords are in the House now, if there were many more here, they would object to what I am saying by stating, “That is not harmful at all. What is harmful is what you’re saying, Baroness Fox, because you’re causing psychological harm to all those young people by being transphobic”. I am raising these matters because we think we all agree that there is a consensus on what is harmful material online for young people, but it is not that straightforward.

The amendment states that the Bill should target any platform that posts

“links to, or … encourages child users to seek”

out “dangerous or illegal activity”. I understand “illegal activity”, but on “dangerous” activities, I assume that we do not mean extreme sports, mountain climbing and so on, which are dangerous—that comes to mind probably because I have spent too much time with young people who spend their whole time looking at those things. I worry about the unintended consequences of things being banned or misinterpreted in that way.

Lord Russell of Liverpool (CB)

To respond briefly to the noble Baroness, I shall give a specific example of how Amendment 93 would help. Let us go back to the coroner’s courtroom where the parents of Molly Russell were trying to get the coroner to understand what had happened to their daughter. The legal team from Meta was there, with combined salaries probably in seven figures, and the argument was about the detail of the content. At one point, I recall Ian Russell saying that one of the Meta lawyers said, “We are topic agnostic”. I put it to the noble Baroness that, had the provisions in Amendment 93 been in place, first, under “Content harms” in proposed new paragraph 3(c) and (d), Meta would have been at fault; under “Contact harms” in proposed new paragraph 4(b), Meta would have been at fault; under “Conduct harms” in proposed new paragraph 5(b), Meta would have been at fault; and under “Commercial harms” in proposed new paragraph 6(a) and (b), Meta would have been at fault. That would have made things a great deal simpler.

17:45
Baroness Fox of Buckley (Non-Afl)

I appreciate that this is the case we all have in the back of our minds. I am asking whether, when Meta says it is content agnostic, the Bill is the appropriate place for us to list the topics that we consider harmful. If we are to do that, I was giving examples of contentious, harmful topics. I might have got this wrong—

Baroness Kidron (CB)

I will answer the noble Baroness more completely when I wind up, but I just want to say that she is missing the point of the schedule a little. Like her, I am concerned about the way we concentrate on content harms, but she is bringing it back to content harms. If she looks at it carefully, a lot of the provisions are about contact and conduct: it is about how the system is pushing children to do certain things and pushing them to certain places. It is about how things come together, and I think she is missing the point by keeping going back to individual pieces of content. I do not want to take the place of the Minister, but this is a systems and processes Bill; it is not going to deal with individual pieces of content in that way. It asks, “Are you creating these toxic environments for children? Are you delivering this at scale?” and that is the way we must look at this amendment.

Baroness Fox of Buckley (Non-Afl)

I will finish here, because we have to get on, but I did not introduce content; it is in the four Cs. One of the four Cs is “content” and I am reacting to amendments tabled by the noble Baroness. I do not think I am harping on about content; I was responding to amendments in which content was one of the key elements.

Baroness Kidron (CB)

Let us leave it there.

Baroness Benjamin (LD)

My Lords, I speak in support of these amendments with hope in my heart. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, for leading the charge with such vigour, passion and determination: I am with them all the way.

The Government have said that the purpose of the Bill is to protect children, and it rests on our shoulders to make sure it delivers on this mission. Last week, on the first day in Committee, the Minister said:

“Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does”.—[Official Report, 19/4/23; cols. 274-75.]


This is excellent and I thank the Government for saying it. But the full range of harms and risks to children will not be mitigated by services if they do not know what they are expected to risk-assess for and if they must wait for secondary legislation for this guidance.

The comprehensive range of harms children face every day is not reflected in the Bill. This includes sexual content that does not meet the threshold of pornography. This was highlighted recently in an investigation into TikTok by the Telegraph, which found that a 13 year-old boy was recommended a video about the top 10 porn-making countries, and that a 13 year-old girl was shown a livestream of a pornography actor in her underwear answering questions from viewers. This content is being marketed to children without a user even seeking out pornographic content, but this would still be allowed under the Bill.

Furthermore, high-risk challenges, such as the Benadryl and blackout challenges, which encourage dangerous behaviour on TikTok, are not dealt with in the Bill. Some features, such as the ability of children to share their location, are not dealt with either. I declare an interest as vice-president of Barnardo’s, which has highlighted how these features can be exploited by organised criminal gangs that sexually exploit children, in order to keep tabs on them and trap them in a cycle of exploitation.

It cannot be right that the user-empowerment duties in the Bill include a list of harmful content that services must enable adults to toggle off, yet the Government refuse to produce this list for children. Instead, we have to wait for secondary legislation to outline harms to children, causing further delay to the enforcement of services’ safety duties. Perhaps the Minister can explain why this is.

The four Cs framework of harm, as set out in these amendments, is a robust framework that will ensure service risk assessments consider the full range of harms children face. I will repeat it once again: childhood lasts a lifetime, so we cannot fail children any longer. Protections are needed now, not in years to come. We have waited far too long for this. Protections need to be fast-tracked and must be included in the Bill. That is why I fully support these amendments.

Lord Knight of Weymouth (Lab)

My Lords, in keeping with the Stevenson-Knight double act, I am leaving it to my noble friend to wind up the debate. I will come in at this point with a couple of questions and allow the Minister to have a bit of time to reflect on them. In doing so, I reinforce my support for Amendment 295 in the name of the noble Lord, Lord Russell, which refers to volume and frequency also being risk factors.

When I compare Amendment 20 with Clause 10(6), which refers to children’s risk assessments and what factors should be taken into account in terms of the risk profile, I see some commonality and then some further things which Amendment 20, tabled by the noble Baroness, Lady Kidron, adds. In my opinion, it adds value. I am interested in how the Minister sees the Bill, as it stands currently, covering some issues that I will briefly set out. I think it would be helpful if the Committee could understand that there may be ways that the Bill already deals with some of the issues so wonderfully raised by the noble Baroness; it would be helpful if we can flush those out.

I do not see proposed new subsection (b)(iii),

“risks which can build up over time”,

mentioned in the Bill, nor explicit mention of proposed new subsection (b)(iv),

“the ways in which level of risks can change when experienced in combination with others”,

which I think is critical in terms of the way the systems work. Furthermore, proposed new subsection (b)(vii),

“the different ways in which the service is used including but not limited to via virtual and augmented reality technologies”,

starts to anticipate some other potential harms that may be coming very rapidly towards us and our children. Again, I do not quite see it included. I see “the design of functionalities”, “the business model” and “the revenue model”. There is a lot about content in the original wording of the Bill, which is less so here, and, clearly, I do not see anything in respect of the UN Convention on the Rights of the Child, which has been debated in separate amendments anyway. I wanted to give the Minister some opportunity on that.

Lord Bethell (Con)

My Lords, I restate my commitment to Amendments 20, 93 and 123, which are in my name and those of the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Oxford, and the noble Lord, Lord Stevenson, and the noble Baroness’s Amendment 74. It is a great honour to follow the noble Lord, Lord Knight. He put extremely well some key points about where there are gaps in the existing Bill. I will build on why we have brought forward these amendments in order to plug these gaps.

In doing so, I wish to say that it has been a privilege to work with the right reverend Prelate, the noble Baroness and the noble Lord, Lord Stevenson. We are not from the same political geographies, but that collaboration demonstrates the breadth of the political concern, and the strength of feeling across the Committee, about these important gaps when it comes to harms—gaps that, if not addressed, will put children at great risk. In this matter we are very strongly united. We have been through a lot together, and I believe this unlikely coalition demonstrates how powerful the feelings are.

It has been said before that children are spending an increasing amount of their lives online. However, the degree of that inflection point in the last few years has been understated, as has how much further it has got to go. Mobile phone penetration among 10 year-olds is already around 75%—it is getting younger, and it is getting broader.

In fact, the digital world is totally inescapable in the life of a child, whether that is for a young child who is four to six years old or an older child who is 16 or 17. It is increasingly where they receive their education—I do not think that is necessarily a good thing, but that is arguable—it is where they establish and maintain their personal relationships and it is a key forum for their self-expression.

For anyone who suspects otherwise, I wish to make it clear that I firmly believe in innovation and progress, and I regard the benefits of the digital world as really positive. I would never wish to prevent children accessing the benefits of the internet, the space it creates for learning and building community, and the opportunities it opens for them. However, environments matter. The digital world is not some noble wilderness free from original sin or a perfect, frictionless marketplace where the best, nicest, and most beautiful ideas triumph. It is a highly curated experience defined by the algorithms and service agreements of the internet companies. That is why we need rules to ensure that it is a safe space for children.

I started working on my first internet business in 1995, nearly 30 years ago. I was running the Ministry of Sound, and we immediately realised that the internet was an amazing way of getting through to young people. Our target audiences were either clubbers aged over 18 or the younger brothers and sisters of clubbers who bought our merchandise. The internet gave us an opportunity to get past all the normal barriers, past parents and regulation, to reach a wonderful new market. I built a good business and it worked out well for me, but those were the days before GDPR and before we understood the internet as we do now. I know from my experience that we need to ensure that children are protected and shielded from the harms that bombard them, because there are strong incentives—mainly financial but also other, malign incentives—for bad actors to use the internet to get through to children.

Unfortunately, as the noble Baroness, Lady Kidron, pointed out, the Bill as it stands does not achieve that aim. Take, for example, contact harms, such as grooming and child sexual abuse. In February 2020, Bark, a US-based organisation that helps families manage and protect their children’s digital lives, launched an 11 year-old persona online who it called Bailey. Bailey’s online persona clearly shows that she is an ordinary 11 year-old, posting content that is ordinary for an 11 year-old. Within 30 seconds of her persona being launched online she received a like from a man whose profile picture was a penis. Within two minutes, multiple messages were received from men, and within five minutes a video call. Shortly afterwards, she received requests from men to meet up. I remind your Lordships that Bailey was 11 years old. These are not trivial content harms; these are attempts to contact a minor using the internet as a medium.

18:00
The Bill does try to address contact harms. I am supportive of the Bill and its principles, and am a big fan of the team that is trying to drive it through Parliament. For example, the Bill specifies that features that enable adults to search for and contact children must be risk-assessed. However, that is where the Bill currently stops. There is no comprehensive list of harms or features to inform that risk assessment. For instance, a feature such as live-streaming, which can enable adults to access children directly, is not specifically referenced, meaning that there is no explicit obligation for services to risk-assess such features—a big gap. Services cannot be expected to read between the lines of what the Government’s intentions may be. We must be explicit and clear in the Bill if we are serious about delivering on children’s safety.
That is why our amendments would do four things. First, they would introduce into the Bill a new schedule of harms to children, framed around the four categories of risk: content, contact, conduct and contract or commercial. This will ensure that harms to children in the Bill reflect the full range of harms that children encounter online, including from design features that facilitate pathways to harm to pornography, self-harm, pro-suicide content and grooming. Secondly, we seek to ensure that services must risk-assess for the harms listed in this proposed new schedule in order to ensure that these risk assessments are comprehensive. Thirdly, we seek to task Ofcom with producing guidance, to be updated every 12 months, on new and emerging harms. Fourthly, we seek to ensure that Ofcom consults with children’s advocates and charities when producing this guidance.
The timing of this is very important. The real-life Bailey cannot wait for harms to be outlined in secondary legislation. These problems have been around for a long time but, as I said, the inflection curve on technology is shooting right up the hockey stick at the moment. If the primary purpose of the Bill is to protect children, it must include the harms that children face every day in primary legislation—not in secondary legislation, not after Royal Assent, not in a year or two, but now. Our Amendment 93 would introduce in the Bill a schedule of harms to children. This non-exhaustive list would produce a robust and future-proof framework, and have the clarity and mandate to produce comprehensive risk assessments to ensure that the Bill delivers on its chief purpose: protecting children.
Baroness Harding of Winscombe (Con)

My Lords, I support this group of amendments, so ably introduced by my noble friend and other noble Lords this afternoon.

I am not a lawyer and I would not say that I am particularly experienced in this business of legislating. I found this issue incredibly confusing. I hugely appreciate the briefings and discussions—I feel very privileged to have been included in them—with my noble friend the Minister, officials and the Secretary of State herself in their attempt to explain to a group of us why these amendments are not necessary. I was so determined to try to understand this properly that, yesterday, when I was due to travel to Surrey, I took all my papers with me. I got on the train at Waterloo and started to work my way through the main challenges that officials had presented.

The first challenge was that, fundamentally, these amendments cut across the Bill’s definitions of “primary priority content” and “priority content”. I tried to find them in the Bill. Unfortunately, in Clause 54, there is a definition of primary priority content. It says that, basically, primary priority content is what the Secretary of State says it is, and that content that is harmful to children is primary priority content. So I was none the wiser on Clause 54.

One of the further challenges that officials have given us is that apparently we, as a group of noble Lords, were confusing the difference between harm and risk. I then turned to Clause 205, which comes out with the priceless statement that a risk of harm should be read as a reference to harm—so maybe they are the same thing. I am still none the wiser.

Yesterday morning, I found myself playing what I can only describe as a parliamentary game of Mornington Crescent, as I went round and round in circles. Unfortunately, it was such a confusing game of Mornington Crescent that I forgot that I needed to change trains, ended up in Richmond instead of Redhill, and missed my meeting entirely. I am telling the Committee this story because, as the debate has shown, it is so important that we put in the Bill a definition of the harms that we are intending to legislate for.

I want to address the points made by the noble Baroness, Lady Fox. She said that we might not all agree on what is genuinely harmful for children. That is precisely why Parliament needs to decide this, rather than abdicate it to a regulator who, as other noble Lords said earlier today, is then put into a political space. It is the job of Parliament to decide what is dangerous for our children and what is not. That is the approach that we take in the physical world, and it should be the approach that we take in the online world. We should do that in broad categories, which is why the four Cs are such a powerful framework. I know that we are all attempting to predict the known unknowns, which is impossible, but this framework, which gives categories of harm, can clearly be updated, developed and, as my noble friend Lord Bethell said, properly consulted on. We as parliamentarians should decide; that is the purpose of voting in Parliament.

I have a couple of questions for my noble friend the Minister. Does he agree that Parliament needs to decide what the categories of online harms are that the Bill is attempting to protect our children from? If he does, why is it not the four Cs? If he really thinks it is not the four Cs, will he bring back an alternative schedule of harms?

Lord Allan of Hallam (LD)

My Lords, I will echo the sentiments of the noble Baroness, Lady Harding, in my contribution to another very useful debate, which has brought to mind the good debate that we had on the first day in Committee, in response to the amendment tabled by the noble Lord, Lord Stevenson, in which we were seeking to get into the Bill what we are actually trying to do.

I thought that the noble Baroness, Lady Fox, was also welcoming additional clarity, specifically in the area of psychological harm, which I agree with. Certainly in its earlier incarnations, the Bill was scattered throughout with references to psychological harm; some of those have been removed, but the ones that remain are very much open to interpretation. I hope that we will come back to that.

I was struck by the point made by the noble Lord, Lord Russell, around what took place in that coroner’s hearing. You had two different platforms with different interpretations of what they thought that their duty of care would be. That is very much the point. In my experience, platforms will follow what they are told to follow. The challenge is when each of them comes to their own individual view around what are often complex areas. There we saw platforms presenting different views about their risk assessments. If we clarify that for them through amendments such as these, we are doing everyone a favour.

I again compliment my noble friend Lady Benjamin for her work in this area. Her speech was also a model of clarity. If we can bring some of that clarity to the legislation and to explaining what we want, that will be an enormous service.

The noble Lord, Lord Knight, made some interesting points around how this would add value to the Bill, teasing out some of the specific gaps that we have there. I look forward to hearing the response on that.

I was interested in the comments from the noble Lord, Lord Bethell, on mobile phone penetration. We should all hold in common that we are not going back to a time BC—before connection. Our children will be connected, which creates the imperative for us to get this right. There has perhaps been a tendency for us to bury our heads in the sand, and occasionally you hear that still—it is almost as if we would wish this world away. However, the noble Baroness, Lady Kidron, is at the other end of the spectrum; she has come alive on this subject, precisely because she recognises that that will not happen. We are in a world where our children will be connected, so it is on us to figure out how we want those connections to work and to instruct the people who provide those connective services on what they should do. It is certainly not for us to imagine that somehow they will all go away. We will come to that in later groups when we talk about minimum ages; if younger children are online, there is a real issue around how we are going to deal with that.

The right reverend Prelate the Bishop of Oxford highlighted some really important challenges based on real experiences that families today are suffering—let us use the word as it should be—and made the case for clarity. I do not know how much we are allowed to talk in praise of EU legislation, but I am looking at the Digital Services Act—I have looked at a lot of EU legislation—and this Bill, and there is a certain clarity to EU regulation, particularly the process of adding recitals, which are attached to the law and explain what it is meant to do. That is sometimes missing here. I know that there are different legal traditions, but you can sometimes look at an EU regulation and the UK law and the former appears to be much clearer in its intent.

That brings me to the substance of my comments in response to this group, so ably introduced by the noble Baroness, Lady Kidron. I hope that the Government heed and recognise that, at present, no ordinary person can know what is happening in the Bill—other than, perhaps, the wife of the noble Lord, Lord Stevenson, who will read it for fun—and what we intend to do.

I was thinking back to the “2B or not 2B” debate we had earlier about the lack of clarity around something even as simple as the classification of services. I was also thinking that, if you ask what the Online Safety Bill does to restrict self-harm content, the answer would be this: if it is a small social media platform, it will probably be categorised as a 2B service; then we can look at Schedule 7, where it is prohibited from assisting suicide; but we might want to come back to some of the earlier clauses with the specific duties—and it will go on and on. As the noble Baroness, Lady Harding, described, you are leaping backwards and forwards in the Bill to try to understand what we are trying to do with the legislation. I think that is a genuine problem.

In effect, the Bill is Parliament setting out the terms of service for how we want Ofcom to regulate online services. We debated terms of service earlier. What is sauce for the goose is sauce for the gander. We are currently failing our own tests of simplicity and clarity on the terms of service that we will give to Ofcom.

As well as platforms, ordinary people want to find out what is happening and, just like those platforms with their terms of service, we are going to make them read hundreds of pages before they find out what this legislation is intended to do. We can and should make this simpler for children and parents. I was able to meet Ian Russell briefly at the end of our Second Reading debate. He has been an incredibly powerful and pragmatic voice on this. He is asking for reasonable things. I would love to be able to give a Bill to Ian Russell, and the other families that the right reverend Prelate the Bishop of Oxford referred to, that they can read and that tells them very clearly how Parliament has responded to their concerns. I think we are a long way short of that simple clarity today.

It would be extraordinarily important for service providers, as I already mentioned in response to the noble Lord, Lord Russell. They need that clarity, and we want to make sure that they have no reason to say, “I did not understand what I was being asked to do”. That should be from the biggest to the smallest, as the noble Lord, Lord Moylan, keeps rightly raising with us. Any small service provider should be able to very clearly and simply understand what we are intending to do, and putting more text into the Bill that does that would actually improve it. This is not about adding a whole load of new complications and the bells and whistles we have described but about providing clarity on our intention. Small service providers would benefit from that clarity.

The noble Baroness, Lady Ritchie, rightly raised the issue of the speed of the development of technology. Again, we do not want the small service provider in particular to think it has to go back and do a whole new legal review every time the technology changes. If we have a clear set of principles, it is much quicker and simpler for it to say, “I have developed a new feature. How does it match up against this list?”, rather than having to go to Clause 12, Clause 86, Clause 94 and backwards and forwards within the Bill.

It will be extraordinarily helpful for enforcement bodies such as Ofcom to have a yardstick—again, this takes us back to our debate on the first day—for its prioritisation, because it will have to prioritise. It will not be able to do everything, everywhere, all at once. If we put that prioritisation into the legislation, it will, frankly, save potential arguments between Parliament, the Government and Ofcom later on, when they have decided to prioritise X and we wanted them to prioritise Y. Let us all get aligned on what we are asking them to do up front.

Dare I say—the noble Baroness, Lady Harding, reminded me of this—that it may also be extraordinarily helpful for us as politicians so that we can understand the state of the law. I mean not just the people who are existing specialists or are becoming specialists in this area and taking part in this debate but the other hundreds of Members of both Houses, because this is interesting to everyone. I have experience of being in the other place, and every Member of the other place will have constituents coming to them, often with very tragic circumstances, and asking what Parliament has done. Again, if they have the Online Safety Bill as currently drafted, I think it is hard for any Member of Parliament to be able to say clearly, “This is what we have done”. With those words and that encouraging wind, I hope the Government are able to explain, if not in this way, that they have a commitment to ensuring that we have that clarity for everybody involved in this process.

18:15
Lord Stevenson of Balmacara (Lab)

My Lords, over the last few hours I have praised us for having developed a style of discussion and debate that is certainly relatively new and not often seen in the House, where we have tried to reach out to each other and find common ground. That was not a problem in this last group of just over an hour; I think we are united around the themes that were so brilliantly introduced in a very concise and well-balanced speech by the noble Baroness, Lady Kidron, who has been a leading and inspirational force behind this activity for so long.

Although different voices have come in at different times and asked questions that still need to be answered, I sense that we have reached a point in our thinking, if not in our actual debates, where we need a plan. I too reached this point; that was exactly the motivation I had in tabling Amendment 1, which was discussed on the first day. Fine as the Bill is—it is a very impressive piece of work in every way—it lacks what we need as a Parliament to convince others that we have understood the issues and have the answers to their questions about what this Government, or this country as a whole, are going to do about this tsunami of difference, which has arrived in the wake of the social media companies and search engines, in the way we do our business and live our lives these days. There is consensus, but it is slightly different to the consensus we had in earlier debates, where we were reassuring ourselves about the issues we were talking about but were not reaching out to the Government to change anything so much as being happy that we were speaking the same language and that they were in the same place as we are gradually coming to as a group, in a way.

Just before we came back in after the lunch break, I happened to talk to the noble Lord, Lord Grade, who is the chair of Ofcom and is listening to most of our debates and discussions when his other duties allow. I asked him what he thought about it, and he said that it was fascinating for him to recognise the level of expertise and knowledge that was growing up in the House, and that it would be a useful resource for Ofcom in the future. He was very impressed by the way in which everyone was engaging and not getting stuck in the niceties of the legislation, which he admitted he was experiencing himself. I say that softly; I do not want to embarrass him in any way because he is an honourable man. However, the point he makes is really important.

I say to the Minister that I do not think we are very far apart on this. He knows that, because we have discussed it at some length over the last six to eight weeks. What I think he should take away from this debate is that this is a point where a decision has to be taken about whether the Government are going to go with the consensus view being expressed here and put deliberately into the Bill a repetitive statement, but one that is clear and unambiguous, about the intention behind the Government’s reason for bringing forward the Bill and for us, the Opposition and other Members of this House, supporting it, which is that we want a safe internet for our children. The way we are going to do that is by having in place, up front and clearly in one place, the things that matter when the regulatory structure sits in place and has to deal with the world as it is, of companies with business plans and business models that are at variance with what we think should be happening and that we know are destroying the lives of people we love and the future of our country—our children—in a way that is quite unacceptable when you analyse it down to its last detail.

It is not a question of saying back to us across the Dispatch Box—I know he wants to but I hope he will not—“Everything that you have said is in the Bill; we don’t need to go down this route, we don’t need another piece of writing that says it all”. I want him to forget that and say that actually it will be worth it, because we will have written something very special for the world to look at and admire. It is probably not in its perfect form yet, but that is what the Government can do: take a rough and ready potential diamond, polish it, chamfer it, and bring it back and set it in a diadem we would all be proud to wear—Coronations excepted—so that we can say, “Look, we have done the dirty work here. We’ve been right down to the bottom and thought about it. We’ve looked at stuff that we never thought in our lives we would ever want to see and survived”.

I shake at some of the material we were shown that Molly Russell was looking at. But I never want to be in a situation where I will have to say to my children and grandchildren, “We had the chance to get this right and we relied on a wonderful piece of work called the Online Safety Act 2023; you will find it in there, but it is going to take you several weeks and a lot of mental harm and difficulty to understand what it means”.

So, let us make it right. Let us not just say “It’ll be alright on the night”. Let us have it there. It is almost right but, as my noble friend Lord Knight said, it needs to be patched back into what is already in the Bill. Somebody needs to look at it and say, “What, out of that, will work as a statement to the world that we care about our kids in a way that will really make a difference?” I warn the Minister that, although I said at Second Reading that I wanted to see this Bill on the statute book as quickly as possible, I will not accept a situation where we do not have more on this issue.

Lord Parkinson of Whitley Bay (Con)

I am grateful to all noble Lords who have spoken on this group and for the clarity with which the noble Lord, Lord Stevenson, has concluded his remarks.

Amendments 20, 74, 93 and 123, tabled by the noble Baroness, Lady Kidron, would mean a significant revision of the Bill’s approach to content that is harmful to children. They would set a new schedule of harmful content and risks to children—the four Cs—on the face of the Bill and revise the criteria for user-to-user and search services carrying out child safety risk assessments.

I start by thanking the noble Baroness publicly—I have done so privately in our discussions—for her extensive engagement with the Government on these issues over recent weeks, along with my noble friends Lord Bethell and Lady Harding of Winscombe. I apologise that it has involved the noble Baroness, Lady Harding, missing her stop on the train. A previous discussion we had also very nearly delayed her mounting a horse, so I can tell your Lordships how she has devoted hours to this—as they all have over recent weeks. I would like to acknowledge their campaigning and the work of all organisations that the noble Baroness, Lady Kidron, listed at the start of her speech, as well as the families of people such as Olly Stephens and the many others that the right reverend Prelate the Bishop of Oxford mentioned.

I also reassure your Lordships that, in developing this legislation, the Government carried out extensive research and engagement with a wide range of interested parties. That included reviewing international best practice, including the four Cs framework on the online risks of harm to children; we want this to be world-leading legislation. The Government share the objectives that all noble Lords have echoed in making sure that children are protected from harm online. I was grateful to the noble Baroness, Lady Benjamin, for echoing the remarks I made earlier in Committee on this. I am glad we are on the same page, even if we are still looking at points of detail, as we should be.

As the noble Baroness, Lady Kidron, knows, it is the Government’s considered opinion that the Bill’s provisions already deliver these objectives. I know that she remains to be convinced, but I am grateful to her for our continuing discussions on that point, and for continuing to kick the tyres on this to make sure that this is indeed legislation of which we can be proud.

It is also clear that there is broad agreement across the House that the Bill should tackle harms to children such as content that promotes eating disorders, illegal behaviour such as grooming, and risk factors for harm such as the method by which content is disseminated and the frequency of alerts. I am pleased to be able to put on record that, in the Government’s opinion, the Bill as drafted already does this and reflects the principles of the four Cs framework, covering each of those: content, conduct, contact and commercial or contract risks to children.

First, it is important to understand how the Bill defines content, because that question of definition has been a confusing factor in some of the discussions hitherto. When we talk in general terms about content, we mean the substance of a message. This has been the source of some confusion. The Bill defines “content”, for the purposes of this legislation, in Clause 207 extremely broadly as

“anything communicated by means of an internet service”.

Under this definition, in essence, all user communication and activity, including recommendations by an algorithm, interactions in the metaverse, live streams, and so on, is facilitated by “content”. So, for example, unwanted and inappropriate contact from an adult to a child would be treated by the Bill as a content harm. The distinctions that the four Cs make between content, conduct and contact risks are therefore not necessary. For the purposes of the Bill, they are all content risks.

Secondly, I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill.

Baroness Harding of Winscombe (Con)

Where are the commercial harms? I cannot totally get my head around my noble friend’s definition of content. I can sort of understand how it extends to conduct and contact, but it does not sound as though it could extend to the algorithm itself that is driving the addictive behaviour that most of us are most worried about.

Lord Knight of Weymouth (Lab)

In that vein, will the noble Lord clarify whether that definition of content includes paid-for content?

Lord Parkinson of Whitley Bay (Con)

I was about to list the four Cs briefly in order, which will bring me on to commercial or contract risk. Perhaps I may do that and return to those points.

I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill. In terms of the four Cs category of content risks, there are specific duties for providers to protect children from illegal content, such as content that intentionally assists suicide, as well as content that is harmful to children, such as pornography. Regarding conduct risks, the child safety duties cover harmful conduct or activity such as online bullying or abuse and, under the illegal content safety duties, offences relating to harassment, stalking and inciting violence.

With regard to commercial or contract risks, providers specifically have to assess the risks to children from the design and operation of their service, including their business model and governance under the illegal content and child safety duties. In relation to contact risks, as part of the child safety risk assessment, providers will need specifically to assess contact risks of functionalities that enable adults to search for and contact other users, including children, in a way that was set out by my noble friend Lord Bethell. This will protect children from harms such as harassment and abuse, and, under the illegal content safety duties, all forms of child sexual exploitation and abuse, including grooming.

Baroness Kidron (CB)

I agree that content, although unfathomable to the outside world, is defined as the Minister says. However, does that mean that when we see that

“primary priority content harmful to children”

will be put in regulations by the Secretary of State under Clause 54(2)—ditto Clause 54(3) and (4)—we will see those contact risks, conduct risks and commercial risks listed as primary priority, priority and non-designated harms?

18:30
I do not want to make my speech twice, but in my final sentence I said that my challenge to the Government is that there would be a very simple way forward, by other means, if those things were articulated; however, my understanding is that they intend to bring forward content harms that describe only content as we normally understand it.
Lord Parkinson of Whitley Bay (Con)

I have tried to outline the Bill’s definition of content, which I think will give some reassurance that other concerns that noble Lords have raised are covered. I will turn in a moment to address priority and primary priority content, if the noble Baroness will allow me to do that, and then perhaps intervene again if I have not done so to her satisfaction. I want to set that out and try to keep track of all the questions which have been posed as I do so.

For now, I know there have been concerns from some noble Lords that, if functionalities are not labelled as harms in the legislation, they would not be addressed by providers, and I reassure your Lordships’ House that this is not the case. There is an important distinction between content and other risk factors, such as an algorithm, which without content cannot cause harm to a child. That is why functionalities are not covered by the categories of primary priority and priority content which is harmful to children. The Bill sets out a comprehensive risk assessment process which will cover content or activity that poses a risk of harm to children and other factors, such as functionality, which may increase the risk of harm. As such, the existing children’s risk assessment criteria already cover many of the changes proposed in this amendment. For example, the duties already require service providers to assess the risk of harm to children from their business model and governance. They also require providers to consider how a comprehensive range of functionalities affect risk, how the service is used and how the use of algorithms could increase the risks to children.

Turning to the examples of harmful content set out in the proposed new schedule, I am happy to reassure the noble Baroness and other noble Lords that the Government’s proposed list of primary priority and priority content covers a significant amount of this content. In her opening speech, she asked about cumulative harm—that is, content sent many times or content which is harmful due to the manner of its dissemination. We will look at that in detail on the next group as well, but I will respond to the points she made earlier now. The definition of harm in the Bill under Clause 205 makes it clear that physical or psychological harm may arise from the fact or manner of dissemination of the content, not just the nature of the content—content which is not harmful per se, but which, if sent to a child many times, for example by an algorithm, would meet the Bill’s threshold for content that is harmful to children. Companies will have to consider this as a fundamental part of their risk assessment, including, for example, how the dissemination of content via algorithmic recommendations may increase the risk of harm, and they will need to put in place proportionate and age-appropriate measures to manage and mitigate the risks they identify. I followed the exchanges between the noble Baronesses, Lady Kidron and Lady Fox, and I make it clear that the approach set out by the Bill will mean that companies cannot avoid tackling the kind of awful content which Molly Russell saw and the harmful algorithms which pushed that content relentlessly at her.

This point on cumulative harm was picked up by my noble friend Lord Bethell. The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service by way of payment or non-financial reward. This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risks in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children. The actions that companies will be required to take under their risk assessment duties in the Bill, and the safety measures they will be required to put in place to manage the service’s risks, will take account of this bigger-picture risk profile.

The amendments of the noble Baroness, Lady Kidron, would remove references to primary priority and priority harmful content to children from the child risk assessment duties, which we fear would undermine the effectiveness of the child safety duties as currently drafted. That includes the duty for user-to-user providers to prevent children encountering primary priority harms, such as pornography and content that promotes self-harm or suicide, as well as the duty to put in place age-appropriate measures to protect children from other harmful content and activity. As a result, we fear these amendments could remove the requirement for an age-appropriate approach to protecting children online and make the requirement to prevent children accessing primary priority content less clear.

The noble Baroness, Lady Kidron, asked in her opening remarks about emerging harms, which she was right to do. As noble Lords know, the Bill has been designed to respond as rapidly as possible to new and emerging harms. First, the primary priority and priority lists of content can be updated by the Secretary of State. Secondly, it is important to remember the function of non-designated content that is harmful to children in the Bill—that is, content that meets the threshold of harmful content to children but is not on the lists designated by the Government. Companies are required to understand and identify this kind of content and, crucially, report it to Ofcom. Thirdly, this will inform the actions of Ofcom itself in its review and report duties under Clause 56, where it is required to review the incidence of harmful content and the severity of harm experienced by children as a result of it. This is not limited to content that the Government have listed as being harmful, as it is intended to capture new and emerging harms. Ofcom will be required to report back to the Government with recommendations on changes to the primary priority and priority content lists.

I turn to the points that the noble Lord, Lord Knight of Weymouth, helpfully raised earlier about things that are in the amendments but not explicitly mentioned in the Bill. As he knows, the Bill has been designed to be tech-neutral, so that it is future-proof. That is why there is no explicit reference to the metaverse or virtual or augmented reality. However, the Bill will apply to service providers that enable users to share content online or interact with each other, as well as search services. That includes a broad range of services such as websites, applications, social media sites, video games and virtual reality spaces such as the metaverse; those are all captured. Any service that allows users to interact, as the metaverse does, will need to conduct a children’s access assessment and comply with the child safety duties if it is likely to be accessed by children.

Amendment 123 from the noble Baroness, Lady Kidron, seeks to amend Clause 48 to require Ofcom to create guidance for Part 3 service providers on this new schedule. For the reasons I have just set out, we do not think it would be workable to require Ofcom to produce guidance on this proposed schedule. For example, the duty requires Ofcom to provide guidance on the content, whereas the proposed schedule includes examples of risky functionality, such as the frequency and volume of recommendations.

I stress again that we are sympathetic to the aim of all these amendments. As I have set out, though, our analysis leads us to believe that the four Cs framework is simply not compatible with the existing architecture of the Bill. Fundamental concepts such as risk, harm and content would need to be reconsidered in the light of it, and that would inevitably have a knock-on effect on a large number of clauses, as well as on timing. The Bill has benefited from considerable scrutiny—pre-legislative and in many discussions over many years. The noble Baroness, Lady Kidron, has been a key part of that and of improving the Bill. The task is simply unfeasible at this stage in the progress of the Bill through Parliament and risks delaying it, as well as significantly slowing down Ofcom’s implementation of the child safety duties. We do not think that this slowing down is a risk worth taking, because we believe the Bill already achieves what is sought by these amendments.

Even so, I say to the Committee that we have listened to the noble Baroness, Lady Kidron, and others and have worked to identify changes which would further address these concerns. My noble friend Lady Harding posed a clear question: if not this, what would the Government do instead? I am pleased to say that, as a result of the discussions we have had, the Government have decided to make a significant change to the Bill. We will now place the categories of primary priority and priority content which is harmful to children on the face of the Bill, rather than leaving them to be designated in secondary legislation, so Parliament will have its say on them.

We hope that this change will reassure your Lordships that protecting children from the most harmful content is indeed the priority for the Bill. That change will be made on Report. We will continue to work closely with the noble Baroness, Lady Kidron, my noble friends and others, but I am not able to accept the amendments in the group before us today. With that, I hope that she will be willing to withdraw.

Baroness Kidron (CB)

I thank all the speakers. There were some magnificent speeches and I do not really want to pick out any particular ones, but I cannot help but say that the right reverend Prelate described the world without the four Cs. For me, that is what everybody in the Box and on the Front Bench should go and listen to.

I am pleased that the Minister has said that the Government are moving in this direction, and I am very grateful for that, but there are a couple of things that I have to come back on. First, I have swiftly read Clause 205’s definition of harm and I do not think it says that you do not have to reach a barrier of harm; dissemination is quite enough. There is always the problem of what the end result of the harm is. The thing that the Government are not listening to is the relationship between the risk assessment and the harm. It is about making sure that we are clear that it is the functionality that can cause harm. I think we will come back to this at another point, but that is what I beg them to listen to. Secondly, I am not entirely sure that it is correct to say that the four Cs mean that you cannot have primary priority, priority and so on. That could be within the schedule of content, so those two things are not actually mutually exclusive. I would be very happy to have a think about that.

What was not addressed in the Minister’s answer was the point made by the noble Lord, Lord Allan of Hallam, in supporting the proposal that we should have in the schedule: “This is what you’ve got to do; this is what you’ve got to look at; this is what we’re expecting of you; and this is what Parliament has delivered”. That is immensely important, and I was so grateful to the noble Lord, Lord Stevenson, for putting his marker down on this set of amendments. I am absolutely committed to working alongside him and to finding ways around this, but we need to find a way of stating it.

Ironically, that is my answer to both the noble Baronesses, Lady Ritchie and Lady Fox: we should have our arguments here and now, in this Chamber. I do not wish to leave it to the Secretary of State, whom I have great regard for, as it happens, but who knows: I have seen a lot of Secretaries of State. I do not even want to leave it to the Minister, because I have seen a lot of Ministers too—ditto Ofcom, and definitely not the tech sector. So here is the place, and we are the people, to work out the edges of this thing.

Not for the first time, my friend the noble Baroness, Lady Harding, read out what would have been my answer to the noble Baroness, Lady Ritchie. I have gone round and round, and it is like a Marx Brothers movie: in the end, harm is defined by subsection (4)(c), but that says that harm will be defined by the Secretary of State. It goes around like that throughout the Bill.

18:45
There are three members of the pre-legislative committee in the Chamber. We were very clear about design features, and several members who are not present were even clearer. So I hear where we are with the Bill, but I have been following it for five years and have been saying the same thing, so if we are a little late to the party I do not think that is because of me. I do not want to delay the Bill but I want to stamp the authority of Parliament on the question of how harm happens, as well as what it is.
My last sentence has to be: let us remember our conversation about trying to measure illegal harm and then think about it at scale for children. We have to have something softer than that; we cannot do it for each piece of content. The saving grace of the Bill is its systems and processes: it will make the tsunami a trickle—that is what we want to do. It is not to say that young people should not have access to the internet. Although I spent quite a lot of time disagreeing with the noble Baroness, Lady Fox, today, I absolutely agree with her about evolving capacities, and I hope that we revisit that question later. With that, I beg leave to withdraw my amendment.
Amendment 20 withdrawn.
Amendment 21 not moved.
Amendment 21A
Moved by
21A: Clause 10, page 10, line 1, after “19(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 19 about supplying records of risk assessments to OFCOM.
Amendment 21A agreed.
Clause 10, as amended, agreed.
Clause 11: Safety duties protecting children
Amendment 22 not moved.
Amendment 22A
Moved by
22A: Clause 11, page 10, line 6, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to summarise children’s risk assessments in the terms of service (see the amendment in the Minister’s name inserting new subsection (10A) below) is imposed only on providers of Category 1 services.
Amendment 22A agreed.
House resumed.
House adjourned at 6.48 pm.