Grand Committee

My Lords, I start today with probably the most innocuous of the amendments, which is that Clause 44 should not stand part. Others are more significant, but its purpose, if one can describe it as such, is as a probing clause stand part, to see whether the Minister can explain the real motive and impact of new Section 164A, which is inserted by Clause 44. As the explanatory statement says, it appears to hinder
“data subjects’ right to lodge complaints, and extends the scope of orders under Section 166 of the Data Protection Act to the appropriateness of the Commissioner’s response to a complaint.”
I am looking to the Minister to see whether he can unpack the reasons for that and what the impact is on data subjects’ rights.
More fundamental is Amendment 153, which relates to Clause 45. This provision inserts new Section 165A into the Data Protection Act, under which the commissioner would have the discretion to refuse to act on a complaint unless the complainant has first tried to resolve the alleged infringement of their rights with the relevant organisation and at least 45 days have passed since then. The right to an effective remedy constitutes a core element of data protection—most individuals will not pursue cases before a court, because of the lengthy, time-consuming and costly nature of judicial proceedings—and acts as a deterrent against data protection violations, in so far as victims can obtain meaningful redress. Administrative remedies are particularly useful because they focus on addressing malpractice and obtaining meaningful changes in how personal data is handled in practice.
However, the ICO indicates that in 2021-22 it did not serve a single GDPR enforcement notice, secured no criminal convictions and issued only four GDPR fines, totalling just £633,000, despite receiving over 40,000 data subject complaints. Moreover, avenues for challenging ICO inaction are extremely limited. The information tribunal’s scrutiny has been restricted to procedural rather than substantive matters, and it was narrowed even further by an Administrative Court decision which found that the ICO is not obliged to investigate each and every complaint.
Amendment 153 would remove Clause 45. The ICO already enjoys a wide margin of discretion and little accountability for how it handles complaints. In light of its poor performance, it does not seem appropriate to expand the discretion of the new information commission even further. The amendment would also extend the scope of orders under Section 166 of the Data Protection Act to the appropriateness of the commissioner’s response to a complaint. This would allow individuals to secure judicial scrutiny of decisions that have a fundamental impact on how laws are enforced in practice, and it would increase the overall accountability of the new information commission.
We have signed Amendment 154, in the name of the noble Baroness, Lady Jones, and I look forward to hearing what she says on that. I apologise for the late tabling of Amendments 154A to 154F, which are all related to Amendments 155 and 175. Clause 47 sets out changes to court procedure in relation to a data subject’s right to information under the 2018 Act, but there are other issues that need resolving around the jurisdiction of the courts and the Upper Tribunal in data protection cases. That is the reason for tabling these amendments.
The High Court’s judgment in Delo v ICO held that part of the reasoning in Killock and Veale about the relative jurisdiction of the courts and tribunals was wrong. The Court of Appeal’s decision in Delo underlines the concerns but does not properly address the jurisdictional limits in Sections 166 and 167 of the 2018 Act, regarding the distinction between determining procedural failings and the merits of decisions by the ICO. Surely jurisdiction under these sections should sit with either the courts or the tribunals, not both. In the view of many, including me, it should sit with the tribunals. That is what these amendments seek.
It is clear from these two judgments that there was disagreement on the extent of the jurisdiction of tribunals and courts, notably between Mrs Justice Farbey and Mr Justice Mostyn. The commissioner made very different submissions to the Upper Tribunal, the High Court and the Court of Appeal in relation to the extent and limits of Sections 166 and 167. It is not at all clear what Parliament’s intentions were, when passing the 2018 Act, as to the extent and limits of the powers in these sections and whether the appropriate source of redress is a court or a tribunal.
This has resulted in jurisdictional confusion. A large number of claims have been brought in either the courts or the tribunals, under either Section 166 or Section 167, and the respective court or tribunal has frequently ruled that the claim should have been made under the other section and it therefore does not have jurisdiction, so that the claim is struck out. The Bill offers a prime opportunity to resolve this issue.
Clause 45(5), which creates new Section 166A, would only blur the lines even more and strengthen the case for such claims to sit with the tribunals rather than the courts. These amendments would give certainty to the courts and tribunals as to their powers and would be much less confusing for litigants in person, most of whom do not have the luxury of paying hundreds of thousands of pounds in costs. That is itself another reason for this jurisdiction to remain with the tribunals, which do not charge fees to issue proceedings.
The proposed new clause inserted by Amendment 287 would require the Secretary of State to exercise powers under Section 190 of the 2018 Act to allow public interest organisations to raise data protection complaints on behalf of individuals generally, without the need to obtain the authorisation of each individual being represented. It would therefore implement Article 80(2) of the GDPR, which provides:
“Member States may provide that any body, organisation or association referred to in paragraph 1 of this Article, independently of a data subject’s mandate, has the right to lodge, in that Member State, a complaint with the supervisory authority which is competent pursuant to Article 77 and to exercise the rights referred to in Articles 78 and 79 if it considers that the rights of a data subject under this Regulation have been infringed as a result of the processing”.
The intention behind Article 80(2) is to allow appropriately constituted organisations to bring proceedings concerning infringements of the data protection regulations in the absence of the data subject. That is to ensure that proceedings may be brought in response to an infringement, rather than on the specific facts of an individual’s case. As a result, data subjects are, in theory, offered greater and more effective protection of their rights. Actions under Article 80(2) could address systemic infringements that arise by design, rather than requiring an individual to evidence the breaches and their specific effects on them.
At present, an affected individual—a data subject—is always required to bring a claim or complaint to a supervisory authority. Whether through direct action or under Section 187 of the 2018 Act, a data subject will have to be named and engaged. In practice, a data subject is not always identifiable or willing to bring action to address even the most egregious conduct.
Article 80(2) would fill a gap that Article 80(1) and Section 187 of the Data Protection Act are not intended to fill. Individuals can be unwilling to seek justice, exercise their rights and lodge data protection complaints on their own, either for fear of retaliation from a powerful organisation or because of the stigma that may be associated with the matter in which a data protection violation occurred. Even a motivated data subject may be unwilling to take action because of the risks involved. For instance, it would be reasonable for that data subject not to want to become involved in a lengthy, costly legal process that may be disproportionate to the loss suffered or the remedy available. This is particularly pressing where the infringement raises systemic concerns, rather than where an individual has suffered material or non-material damage as a result of the infringement.
Civil society organisations have long helped complainants navigate justice systems in seeking remedies in the data protection area, providing a valuable addition to the enforcement of UK data protection laws. My Amendment 287 would allow public interest organisations to lodge representative complaints, even without the mandate of data subjects, to encourage the filing of well-argued, strategically important cases with the potential to improve significantly the landscape for data subjects as a whole. This Bill is the ideal opportunity for the Government to implement Article 80(2) of the GDPR in full and plug a significant gap in the protection of UK citizens’ privacy.
In effect, this is unfinished business from our debates on the 2018 Act, when we made several attempts to persuade the Government of the merits of introducing the rights under Article 80(2). I hope that the Government will think again. These are extremely important rights and are available in many other countries with similar GDPR-based regimes. I beg to move.
My Lords, as a veteran of the 2018 arguments on Article 80(2), I rise in support of Amendment 287, which would see its implementation.
Understanding and exercising personal data rights is not straightforward. Even when the rights are being infringed, it is rare that an individual data subject has the time, knowledge or ability to make a complaint to the ICO. This is particularly true for vulnerable groups, including children and the elderly, disadvantaged groups and other groups of people, such as domestic abuse survivors or members of the LGBTQ community, who may have specific reasons for not identifying themselves in relation to a complaint. It is a principle in law that a right that cannot be activated is not fully given.
A data subject’s ability to claim protection is constrained by a range of factors, none of which relates to the validity of their complaint or the level of harm experienced. Rather, the vast majority are prevented from making a complaint by a lack of expertise, capacity, time and money; by the fact that they are not aware that they have data rights; or by the fact that they understand neither that their rights have been infringed nor how to make a complaint about them.
I have considerable experience of this. I remind the Committee that I am chair of the 5Rights Foundation, which has raised important and systemic issues of non-compliance with the AADC. It has done this primarily by raising concerns with the ICO, which has then undertaken around 40 investigations based on detailed submissions. However, because the information is not part of a formalised process, the ICO has no obligation to respond to the 5Rights Foundation team, the three-month time limit for complaints does not apply and, even though forensic work by the 5Rights Foundation identified the problem, its team is not consulted or updated on progress or the outcome—all of which would be possible had it submitted the information as a formal complaint. I remind the Committee that in these cases we are talking about complaints involving children.
Grand Committee

My Lords, I support Amendment 135 in the name of the noble Lord, Lord Bethell, to which I have added my name. He set out our struggle during the passage of the Online Safety Bill, when we made several attempts to get something along these lines into the Bill. It is worth quoting the Minister, Paul Scully, who said at the Dispatch Box in the other place:
“we have made a commitment to explore this … further and report back to the House in due course on whether further measures to support researcher access to data are required and, if so, whether they could also be implemented through other legislation such as the Data Protection and Digital Information Bill”.—[Official Report, Commons, 12/9/23; col. 806.]
When the Minister responds, perhaps he could update the House on that commitment and explain why the Government decided not to address it in this Bill. Although the Bill proposes to lessen the protections on the use of personal data for research by commercial companies, including for the development of products and marketing, it does nothing to enable public interest research.
I would like to add to the list that the noble Lord, Lord Bethell, started: as well as Melanie Dawes, the CEO of Ofcom, the United States National Academy of Sciences, the Lancet commission, the UN advisory body on AI, the US Surgeon General, the Broadband Commission and the Australian eSafety Commissioner have all, in the last few months, called for greater data access for independent researchers.
I ask the noble Viscount to explain the Government’s thinking in detail, and I really do hope that we do not get more “wait and see”, because that does not meet the need. We have already passed online safety legislation that requires evidence, and by denying access to independent researchers we have created a perverse situation in which the regulator has to turn to the companies it is regulating for the evidence to create its codes, which, as the noble Viscount will appreciate, is a formula for the tech companies to control the flow of evidence and unduly temper the intent of the legislation. I wish to make most of my remarks on that subject.
In Ofcom’s consultation on its illegal harms code, the disparity between the harms identified and Ofcom’s proposed code caused deep concern. Volume 4 states the following at paragraph 14.12 in relation to content moderation:
“We are not proposing to recommend some measures which may be effective in reducing risks of harm. This is principally due to currently limited evidence”.
Further reading of volume 4 confirms that lack of evidence is the reason given for failing to recommend measures across a number of harms. Ofcom has identified harms for which it does not require mitigation. This is not what Parliament intended, and it spectacularly fails to deliver on the promises made by Ministers. Ofcom can use its information-gathering powers to build the evidence needed to justify a bolder approach to measures; although that is welcome, it is unsatisfactory for many reasons.
First, given the interconnectedness of privacy, safety, security and competition, regulatory standards cannot be developed in silos. We have a thriving academic community that can work across different risks and identify solutions across different parts of the tech ecosystem.
Secondly, a regulatory framework in which standards are determined exclusively through private dialogue between the regulator and the regulated does not have the necessary transparency and accountability to win public trust.
Thirdly, regulators are overstretched and under-resourced. Our academics stand ready and willing to work in the public interest and in accordance with the highest ethical standards in order to scrutinise and understand the data held so very closely by tech companies, but they need a legal basis to demand access.
Fourthly, if we are to maintain our academic institutions in a post-Brexit world, we need to offer UK academics the same support as those in Europe. Article 40(4) of the European Union’s Digital Services Act requires platforms to
“provide access to data to vetted researchers”
seeking to carry out
“research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35”.
It will be a considerable loss to the UK academic sector if its European colleagues have access to data that it does not.
Fifthly, by insisting on evidence but not creating a critical pathway to secure it, the Government have created a situation in which the lack of evidence could mean that Ofcom’s codes are fixed at what the tech companies tell it is possible in spring 2024, and will always be backward-looking. There is considerable whistleblower evidence revealing measures that the companies could have taken but chose not to.
I have considerable personal experience of this. For example, it was nearly a decade ago that I told Facebook that direct messaging on children’s accounts was dangerous, yet only now are we beginning to see regulation reflecting that blindingly obvious fact. That is nearly a decade in which something could have been done by the company but was not, and of which the regulator will have no evidence.
Finally, as we discussed on day one in Committee, the Government have made it easier for commercial companies to use personal data for research by lowering the bar for the collection of data and expanding the concept of research, further building the asymmetry that has been mentioned in every group of amendments we have debated thus far. It may not be very parliamentary language, but it is crazy to pass legislation and then obstruct its implementation by insisting on evidence that you have made it impossible to gather.
I would be grateful if the Minister could answer the following questions when he responds. Is it the Government’s intention that Ofcom codes be based entirely on the current practice of tech companies and that the regulator can demand only mitigations that exist currently, as evidenced by those companies? Do the Government agree that whistleblowers, NGO experts and evidence from user experience can be taken by regulators as evidence of what could or should be done? What route do the Government advise Ofcom to take to mitigate identified risks for which there are no current measures in place? For example, should Ofcom describe the required outcome and leave it to the companies to determine how they mitigate the risk, should it suggest mitigations that have been developed but not tried—or is the real outcome of the OSA to identify risk and leave that risk in place?
Do the Government accept that EU research done under the auspices of the DSA should be automatically considered as an adequate basis for UK regulators where the concerns overlap with UK law? Will the new measures announced for testing and sandboxing of AI models allow for independent research, in which academics, independent of government or tech, will have access to data? Finally, what measures will the Government take to mitigate the impact on universities of a brain drain of academics to Europe, if we do not provide equivalent legislative support to enable them to access the data required to study online safety and privacy? If the Minister is unable to answer me from the Dispatch Box, perhaps he will agree to write to me and place his letter in the Library for other noble Lords to read.
My Lords, there is little for me to say. The noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, have left no stone unturned in this debate. They introduced this amendment superbly, and I pay tribute to them and to Reset, which was with us all the way through the discussions on online harms at the Joint Committee on the draft Online Safety Bill, advocating for these important provisions.
As the noble Lord, Lord Bethell, said, there is a strong body of opinion out there. Insight from what might be called approved independent researchers would enable policy-making and regulatory innovation to keep pace with emerging trends and threats, which can span individual harms, matters of public safety and even national security. We have seen the kinds of harms taking place in social media, and it is absolutely vital that we understand what is happening under the bonnet of social media. It is crucial in detecting, identifying and understanding the systemic risks of online harms and non-compliance with law.
When we discussed the Online Safety Bill, it was a question of not just content but functionality. That was one of the key things. An awful lot of this research relates to that: how algorithms operate in amplifying content and some of the harms taking place on social media. The noble Lord, Lord Bethell, referred to X closing its API for researchers and Meta’s move to shut CrowdTangle. We are going into reverse, whereas we should be moving forward in a much more positive way. When the Online Safety Bill was discussed, we got the review from Ofcom, but we did not get the backup—the legislative power for Ofcom or the ICO to be able to authorise and accredit researchers to carry out the necessary research.
The Government’s response to date has been extremely disappointing, given the history behind this and the pressure and importance of this issue. This dates from discussions some way back, even before the Joint Committee met and heard the case for this kind of researcher access. This Bill is now the best vehicle by which to introduce a proper regime on access for researchers. As the noble Baroness, Lady Kidron, asked, why, having had ministerial assurances, are we not seeing further progress? Are we just going to wait until Ofcom produces its review, which will be at the tail end of a huge programme of work which it has to carry out in order to implement the Online Safety Act?
Grand Committee

My Lords, once more into the trenches we go before Easter. In moving Amendment 53, I will also speak to Amendments 54, 55, 57, 69, 70, 71 and 72 and the Clause 14 stand part notice.
The Bill contains a number of wide delegated powers, giving the Secretary of State the power to amend the UK GDPR via statutory instrument. The Government have said that the UK GDPR’s key elements remain sound and that they want to continue to offer a high level of protection for the public’s data, but that is no guarantee against significant reforms being brought in through a process that eludes full parliamentary scrutiny through primary legislation. Proposed changes to the UK GDPR should be contained in the Bill, where they can be debated and scrutinised properly via the primary legislation process. As it stands, key provisions of the UK GDPR can subsequently be amended via statutory instrument, which, in this case, is an inappropriate legislative process that affords much less scrutiny and debate, if debates are held at all.
The UK GDPR treats a solely automated decision as one without “meaningful human involvement”. The public are protected from being subject to solely automated decision-making where the decision has a legal or “similarly significant effect”. Clause 14(1) inserts new Article 22D(1) into the UK GDPR, which allows the Secretary of State to make regulations that deem a decision to have involved “meaningful human involvement” even if there was no active review by a human decision-maker. New Article 22D(2) similarly allows the Secretary of State to make regulations to determine whether a decision had a “similarly significant effect” to a legal effect. For example, in summer 2020 there was the A-level algorithm grading scandal. If something like that were to recur, under this new power a Minister could lay regulations stating that the decision to use an algorithm in grading A-levels was not a decision with a “similarly significant effect”.
New Article 22D(4) also allows the Secretary of State to add or remove, via regulations, any of the listed safeguards for automated decision-making. If the Government wish to amend or remove safeguards on automated decision-making, that should also be specified in the Bill and not left to delegated legislation. Amendments 53 to 55 and 69 to 72 would limit the Secretary of State’s power so that safeguards may be added but those in new Article 22D, as they stand when the legislation comes into force, may not be varied or removed.
If the clause is to be retained, we support Amendment 59A in the name of the noble Lord, Lord Holmes, which requires the Information Commissioner’s Office to develop guidance on the interpretation of the safeguards in new Article 22C and on important terms such as “similarly significant effect” and “meaningful human involvement”. It is within the Information Commissioner’s Office’s duties to issue guidance and to harmonise the interpretation of the law. As the dedicated regulator, the ICO is best placed and equipped to publish guidance and ensure consistency of application.
As a way to increase protections and incorporate more participation from those affected, Amendment 59A would add a new paragraph (7) to new Article 22D, which specifies that the Secretary of State needs to consult with the Information Commissioner’s Office if developing regulations. It also includes an obligation for the Secretary of State to consult with data subjects or their representatives, such as trade union or civil society organisations, at least every two years from the commencement of the Bill.
Our preference is for Clause 14 not to stand part of the Bill. The deployment of automated decision-making under Clause 14 risks automating harm, including discrimination, without adequate safeguards. Clause 14 creates a new starting point for all ADM using personal, but not special category, data: it is allowed, including for profiling, provided that certain safeguards are in place. The Minister said those safeguards are “appropriate” and “robust” and provide “certainty”, but I preferred what the noble Lord, Lord Bassam, said about the clause:
“We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts”.—[Official Report, 25/3/24; col. GC 150.]
That is very much my feeling about the clause as well.
I refer back to the impact assessment, which we discussed at some point during our discussions about Clause 9. It is very interesting that, in table 15 of the impact assessment, the savings on compliance costs are something like £7.3 million as regards AI and machine learning, which does not seem a very big number compared with the total savings on compliance costs, which the Government have put rather optimistically at £295 million.
In passing, I should say that, when I look at the savings regarding subject access requests, I see that the figure is £153 million, which is half of those so-called savings on compliance costs. I cannot square that with what the Minister says about the total savings on compliance costs for subject access requests being 1%. I do not know quite where those figures come from, but it is a far more significant percentage: it is 50% of what the Government believe the savings on compliance costs will be. I know that it is not part of this group, but I would be very grateful if the Minister could write to clarify that issue in due course.
Although the Minister has called these safeguards adequate, we believe that they are inadequate for three reasons. First, they shift the burden to the individual. Secondly, there is no obligation to provide any safeguards before the decision is made: neither the Bill nor any of the material associated with it indicates what the content of the information given to the data subject is expected to be, nor the timescales within which that information is to be given. There is nothing to say when representations or contests may be heard, when human intervention may be sought, or at what level that intervention will take place. Thirdly, the Secretary of State has delegated powers to vary the safeguards by regulations.
Article 22 is currently one of the strongest prohibitions in the GDPR. As we know, the current starting point is that using solely automated decision-making is prohibited unless certain exemptions apply. The exemptions are limited. Now, as a result of the Government’s changes, you can use solely automated decision-making in an employment context in the UK, which you cannot do in the EU. That is a clear watering down of the restriction. The Minister keeps returning to the safeguards, but I have referred to those. We know that they are not being applied in practice even now and that hiring and firing is taking place without any kind of human review.
There is therefore an entirely inadequate basis on which we can be satisfied that the Bill will safeguard individuals from harmful automated decision-making before it is too late. In fact, the effect of the Bill will be the opposite: to permit unfair and unsafe ADM to occur, including discriminatory profiling, which causes harm to individuals. It then places the burden on the individual to complain, without providing any adequate safeguards to guarantee their ability to do so before the harm is already incurred. While I beg to move Amendment 53, our preference would be for Clause 14 to be deleted from the Bill entirely.
My Lords, I will speak to Amendment 57 in my name, Amendment 59 in the name of the noble Baroness, Lady Jones, and the Clause 14 stand part notice from the noble Lord, Lord Clement-Jones. In doing so, I register my support for Amendment 59A in the name of the noble Lord, Lord Holmes.
The Government assert that there is no diminution of rights in the Bill, yet Clause 14 removes the right not to be subject to an automated decision and replaces that right with inadequate safeguards, as the noble Lord, Lord Clement-Jones, said. On the previous day in Committee, the Minister made the argument that:
“These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles”,—[Official Report, 25/3/24; col. GC 146.]
but I hope he will at least accept that safeguards do not constitute a right. The fact that the Secretary of State has delegated powers to change the safeguards at will undermines his argument that UK citizens have lost nothing at all; they have lost the right not to be subject to an automated decision.
The fact that the Government have left some guard-rails for special category data is in itself an indication that they know they are downgrading UK data rights, because the safeguards in place are not adequate. If they were adequate, it would be unnecessary to separate out special category data in this way. I hammer the point home by asking the Minister to explain how the protections will work in practice in an era of AI, when risks can come from inference and data analytics that do not use special category data but still have a profound impact on the working lives, health, finances and opportunities of data subjects. If data about your neighbourhood, shopping habits, search results, steps or entertainment choices is used to infer an important decision, how would a data subject activate their rights in that case?
As an illustration of this point, the daughter of a colleague of mine, who, as it happens, has deep expertise in data law, this year undertook a video-based interview for a Russell Group university with no human contact. It was not yet an ADM system, but we are inching ever closer to it. Removing the right, as the Government propose, would place the onus on students to complain or intervene—in a non-vexatious manner, of course. Will the Minister set out how UK citizens will be protected from life-changing decisions after the Government’s changes to Article 22, particularly as, in conjunction with other changes, such as those to subject access requests and data protection impact assessments, UK citizens are about to have fewer routes to justice and less transparency about what is happening to their data?
I would also be grateful if the Minister could speak to whether he believes that the granularity and precision of current profiling deployed by AI and machine learning is sufficiently guaranteed to take this fundamental right away. Similarly, I hope that the known concerns about bias and fairness in ADM will be resolved over time, but we are not there yet, so why is it that the Government have a wait-and-see policy on regulation but are not offering the same “wait and see” in relation to data rights?
On Amendment 59 in the name of the noble Baroness, Lady Jones, the number of workers anticipated to be impacted by AI is simply eye-watering. In last Friday’s debate on AI, it was said to be 300 million worldwide, and one in four across Europe. But how workers work with AI is not simply a scary vision of the near future; it is here now.
I have a family member who last year left an otherwise well-paid and socially useful job when his employer introduced surveillance on to his computer while he was working from home. At the time, he said that the way in which it impacted both his self-esteem and his autonomy was so devastating that he felt like
“a cog in a machine or an Amazon worker with no agency or creativity”.
He was an exemplary employee: top of the bonus list and in all measurable ways the right person in the right job. Efficiency in work has a vital role but it is not the whole picture. We know that, if able and skilled workers lose their will to work, it comes at a considerable cost to the well-being of the nation and the public purse. Most jobs in future will involve working with or even collaborating with technology; ensuring that work is dignified and fair to the human components of this arrangement is not a drag on productivity but a necessity if society is to benefit from changes to technology.
Grand Committee

My Lords, I rise to speak to my Amendment 11, to Amendments 14, 16, 17 and 18, and to the Clause 5 and Clause 7 stand part notices. I will attempt to be as brief as I can, but Clause 5 involves rather a large number of issues.
Processing personal data is currently lawful only if it rests on at least one lawful basis, one of which is that the processing is necessary for the legitimate interests pursued by the controller or a third party, except where those interests are overridden by the interests or fundamental rights of the data subject. As such, if a data controller relies on legitimate interest as the legal basis for processing data, it must conduct a balancing test between its own interests and those of the data subject.
Clause 5 amends the UK GDPR’s legitimate interest provisions by introducing the concept of recognised legitimate interest, which allows data to be processed without a legitimate interest balancing test. This provides businesses and other organisations with a broader scope of justification for data processing. Clause 5 would amend Article 6 of the UK GDPR to equip the Secretary of State with a power to determine these new recognised legitimate interests. Under the proposed amendment, the Secretary of State must have regard to,
“among other things … the interests and fundamental rights and freedoms of data subjects”.
The usual legitimate interest test is much stronger: rather than merely a topic to have regard to, a legitimate interest basis cannot lawfully apply if the data subject’s interests override those of the data controller.
Annexe 1, as inserted by the Bill, now provides a list of exemptions, but it is overly broad and vague. It includes national security, public security and defence, and emergencies and crime as legitimate interests for data processing without an assessment. The Conservative MP Marcus Fysh said on Third Reading:
“Before companies share data or use data, they should have to think about what the balance is between a legitimate interest and the data rights, privacy rights and all the other rights that people may have in relation to their data. We do not want to give them a loophole or a way out of having to think about that.” —[Official Report, Commons, 29/11/23; col. 896.]
I entirely agree with that.
The amendment in Clause 5 also provides examples of processing that may be considered legitimate interests under the existing legitimate interest purpose, under Article 6(1)(f), rather than under the new recognised legitimate interest purpose. These include direct marketing, intra-group transmission of personal data for internal administrative purposes, and processing necessary to ensure the security of a network.
The Bill also creates a much more litigious data environment. Currently, an organisation’s assessment of its lawful purposes for processing data can be challenged through correspondence or an ICO complaint, whereas, under the proposed system, an individual may be forced to challenge a statutory instrument in the courts in order to contest the basis on which their data is processed.
As I will explain later, our preference is that the clause not stand part, but I accept that there are some areas that need clarification and Amendment 11 is designed to do this. The UK GDPR sets out conditions in which processing of data is lawful. The Bill inserts in Article 6(1) a provision specifying that processing shall be lawful for the purposes of a recognised legitimate interest, as I referred to earlier, an example of which may be for the purposes of direct marketing.
Many companies obtain data from the open electoral register. The register is maintained by local authorities, which have the right to sell this data to businesses. Amendment 11 would insert new Article 6(1)(aa) and (ab), which provide that data processing shall be lawful where individuals have consented for their data
“to enter the public domain via a public body”,
or where processing is carried out by public bodies pursuant to their duties and rights, which may include making such data available to the public. Individuals are free to opt out of the open electoral register if they so wish, and it would be disproportionate, and indeed irritating, to notify consumers who have already consented to their data entering the public domain every time that data is processed.
On Amendment 14, as mentioned, the Bill would give the Secretary of State the power to determine recognised legitimate interests through secondary legislation, which is subject to minimal levels of parliamentary scrutiny. Although the affirmative procedure is required, this does not entail much scrutiny or much of a debate. The last time MPs did not approve a statutory instrument under the affirmative procedure was in 1978. In practice, interests could be added to this list at any time and for any reason, facilitating the flow and use of personal data for limitless potential purposes. Businesses could be obligated to share the public’s personal data with government or law enforcement agencies beyond what they are currently required to do, all based on the Secretary of State’s inclination at the time.
We are concerned that this Henry VIII power is unjustified and undermines the very purpose of data protection legislation, which is to protect the privacy of individuals in a democratic data environment, as it vests undue power over personal data rights in the Executive. This amendment is designed to prevent the Secretary of State from having the ability to pre-authorise data processing outside the usual legally defined route. It is important to avoid a two-tier data protection framework in which the Secretary of State can decide that certain processing is effectively above the law.
On Amendment 17, some of the most common settings in which data protection law is broken involve the sharing of the HIV status of a person living with HIV, whether in their personal life, in employment, in healthcare services or by the police. The sharing of an individual’s HIV status can lead to further discrimination against people living with HIV and can increase their risk of harassment or even violence. The National AIDS Trust is concerned that the Bill as drafted does not go far enough to prevent individuals’ HIV status being shared with others without their consent. They and we believe that the Bill must clarify what an “administrative purpose” is for organisations processing employees’ personal data. Amendment 17 would add wording to clarify that, in paragraph 9(b) of Article 6,
“intra-group transmission of personal data”
in the workplace, within an organisation or in a group of organisations should be permitted only for individuals who need to access an employee’s personal data as part of their work.
As far as Amendment 18 is concerned, as it stands Clause 5 gives an advantage to large undertakings with numerous companies that can transmit data intra-group purely because they are affiliated to one central body. However, this contradicts the repeated position of both the ICO and the CMA that first party versus third party is not a meaningful distinction for capturing privacy risk: what matters is what data is processed, not the corporate ownership of the systems doing the processing. The amendment reflects the organisational measures that undertakings should have in place as safeguards: groups of undertakings should be required to put organisational measures in place, via contract, before they can take advantage of this intra-group transmission of data.
Then we come to the question of Clause 5 standing part of the Bill. This clause is unnecessary and creates risks. It is unnecessary because the legitimate interest balancing test is, in fact, flexible and practical; it already allows processing for emergencies, safeguarding and so on. It is risky because creating lists of specified legitimate interests inevitably narrows the concept and may make controllers less certain about whether a legitimate interest that is not a recognised legitimate interest can be characterised as such. In the age of AI, where change is exponential, we need principles-based and outcome-based legislation that is flexible and can be supplemented with guidance from an independent regulator, rather than a system that requires the Government to legislate more and faster in order to catch up.
There is also a risk that the drafting of this provision does not dispense with the need to conduct a legitimate interest balancing test, because all the recognised legitimate interests contain a test of necessity. Established case law interprets the concept of necessity under data protection law as requiring a human rights balancing test to be carried out. This rather points to the smoke-and-mirrors effect of the drafting, which does nothing to improve legal certainty for organisations or protections for individuals.
I now come to Clause 7 standing part. This clause creates a presumption that processing will always be in the public interest or substantial public interest if done in reliance on a condition listed in proposed new Schedule A1 to the Data Protection Act 2018. The schedule will list international treaties that have been ratified by the UK. At present, the Bill lists only the UK-US data-sharing agreement as constituting relevant international law. Clause 7 seeks to remove the requirement for a controller to consider whether the legal basis on which they rely is in the public interest or substantial public interest, has appropriate safeguards and respects data subjects’ fundamental rights and freedoms. But the conditions in proposed new Schedule A1 in respect of the UK-US agreement also state that the processing must be necessary, as assessed by the controller, to respond to a request made under the agreement.
It is likely that a court would interpret “necessity” in the light of the ECHR. The court may therefore consider that the inclusion of a necessity test means that a controller would have to consider whether the UK-US agreement, or any other treaty added to the schedule, is proportionate to a legitimate aim pursued. Not only is it unreasonable to expect a controller to do such an assessment; it is also highly unusual. International treaties are drafted on a state-to-state basis and not in a way that necessarily corresponds clearly with domestic law. Further, domestic courts would normally consider the rights under the domestic law implementing a treaty, rather than having to interpret an international instrument without reference to a domestic implementing scheme. Being required to do so may make it more difficult for courts to enforce data subjects’ rights.
The Government have not really explained why it is necessary to amend the law in this way rather than simply implementing the UK-US agreement domestically. That would be the normal approach; it would remove the need to add this new legal basis and enable controllers to use the existing framework to identify a legal basis to process data in domestic law. Instead, this amendment makes it more difficult to understand how the law operates, which could in turn deter data sharing in important situations. Perhaps the Minister could explain why Clause 7 is there.
I beg to move.
My Lords, I rise to speak to Amendments 13 and 15. Before I do, let me say that I strongly support the comments of the noble Lord, Lord Clement-Jones, about HIV and the related vulnerability, and his assertion—almost—that Clause 5 is a solution in search of a problem. “Legitimate interest” is a flexible concept and I am somewhat bewildered as to why the Government are seeking to create change where none is needed. In this context, it follows that, were the noble Lord successful in his argument that Clause 5 should not stand part, Amendments 13 and 15 would be unnecessary.
On the first day in Committee, we debated a smaller group of amendments that sought to establish the principle that nothing in the Bill should lessen the privacy protections of children. In his response, the Minister said:
“if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data”.—[Official Report, 20/3/24; col. GC 75.]
I am glad the Minister is open to listening and that the Government’s intention is to protect children, but, as discussed previously, widening the definition of “research” in Clause 3 and watering down purpose limitation protections in Clause 6 negatively impacts children’s data rights. Again, in Clause 5, lowering the protections for all data subjects has consequences for children.
My Lords, this whole area of democratic engagement is one that the Minister will need to explain in some detail. This is an Alice in Wonderland schedule: “These words mean what I want them to mean”. If, for instance, you are engaging with the children of a voter—at 14, they are children—is that democratic engagement? You could drive a coach and horses through Schedule 1. The Minister used the word “necessary”, but he must give us rather more than that. It was not very reassuring.
The Minister mentioned a presumption that the ICO will update its guidance. Is there a timeframe for that? Will the guidance be updated before this comes into effect? How does the age of 14 relate to the AADC, which sets the age of adulthood at 18?
My Lords, the number of amendments proposed to Clause 14 reflects the Committee’s very real concern about the impact of automated decision-making on the privacy, safety and prospects of UK data subjects. I have specific amendments in groups 7 and 8, so I will speak to the impact of Clause 14 on children later. I will again be making arguments about the vulnerability of these systems in relation to the Government’s proposals on the DWP.
Without repeating the arguments made, I associate myself with most of the proposals and the intention behind them—the need to safeguard the prospects of a fair outcome when algorithms hold sway over a person’s future. It seems entirely logical that, if the definition of solely automated decision-making requires “no meaningful human involvement”, we should be clear, as Amendment 40 proposes, about what is considered “meaningful”, so that the system cannot be gamed by providing human involvement that is an ineffective safeguard and therefore not meaningful.
I have sympathy with many of these amendments—Amendments 38A, 39, 47, 62, 64 and 109—and ultimately believe, as the noble Lord, Lord Bassam, suggested, that it is a matter of trust. I refer briefly to the parliamentary briefing from the BMA, which boldly states:
“Clause 14 risks eroding trust in AI”.
That would be a very sad outcome.
My Lords, we have heard some powerful concerns on this group already. This clause sits in one of the most significant parts of the Bill for the future. The Government’s AI policy is of long standing: they started work many years ago, published a National AI Strategy in 2021 and followed it with a road map, a White Paper and a consultation response to that White Paper. Yet this part of the Bill, which is overtly about artificial intelligence and automated decision-making, does not seem to be woven into their thinking at all.
My Lords, the amendments in this group highlight that Clause 14 lacks the necessary checks and balances to uphold equality legislation, individual rights and freedoms, data protection rights, access to services, fairness in the exercise of public functions and workers’ rights. I add my voice to that of the noble Lord, Lord Clement-Jones, in his attempt to make Clause 14 not stand part, which he will speak to in the next group.
I note, as the noble Lord, Lord Bassam, has, that all the current frameworks have fundamental rights at their heart, whether it is the White House blueprint, the UN Secretary-General’s advisory body on AI, with which I am currently involved, or the EU’s AI Act. I am concerned that the UK does not want to work within this consensus.
With that in mind, I particularly note the importance of Amendment 41. As the noble Lord said, we are all supposed to adhere to the Equality Act 2010. I support Amendments 48 and 49, which are virtually interchangeable in seeking to ensure that the threshold of a decision being based “solely” on automated processing cannot be gamed by adding a trivial human element to avoid that designation.
Again, I suggest that the Government cannot have it both ways—with nothing diminished but everything liberated and changed—so I find myself in agreement with Amendment 52A and Amendment 59A, which is in the next group, from the noble Lord, Lord Holmes, who is not in his place. These seek clarity from the Information Commissioner.
I turn to my Amendment 46. My sole concern is to minimise the impact of Clause 14 on children’s safety, privacy and life chances. The amendment provides that a significant decision about a data subject must not be based solely on automated processing if
“the data subject is a child or may be a child unless the provider is satisfied that the decision is in, and compatible with, the best interests of a child”,
taking into account the full gamut of their rights and development stage. Children have enhanced rights under the UNCRC, to which the UK is a signatory. Due to their evolving capacities as they make the journey from infancy to adulthood, they need special protections. If their rights are diminished in the digital world, their rights are diminished full stop. Algorithms determine almost every aspect of a child’s digital experience, from the videos they watch to their social network and from the sums they are asked to do in their maths homework to the team they are assigned when gaming. We have seen young boys wrongly profiled as criminal and girls wrongly associated with gangs.
In a later group, I will speak to a proposal for a code of practice on children and AI, which would codify standards and expectations for the use of AI in all aspects of children’s lives, but for now, I hope the Minister will see that, without these amendments to automated decision-making, children’s data protection will be clearly weakened. I hope he will agree to act to make true his earlier assertion that nothing in the Bill will undermine child protection. The Minister is the Minister for AI. He knows the impact this will have. I understand that, right now, he will probably stick to the brief, but I ask him to go away, consider this from the perspective of children and parents, and ask, “Is it okay for children’s life chances to be automated in this fashion?”
My Lords, I will speak to my Amendment 48. By some quirk of fate, I failed to sign up to the amendments that the noble Lord, Lord Bassam, so cogently introduced. I would have signed up if I had realised that I had not, so to speak.
It is a pleasure to follow the noble Baroness, Lady Kidron. She has a track record of being extremely persuasive, so I hope the Minister pays heed in what happens between Committee and Report. I very much hope that there will be some room for manoeuvre and that there is not just permanent push-back, with the Minister saying that everything is about clarifying and us saying that everything is about dilution. There comes a point when we have to find some accommodation on some of these areas.
Amendments 48 and 49 are very similar—I was going to say, “Great minds think alike”, but I am not sure that my brain feels like much of a great mind at the moment. “Partly” or “predominantly” rather than “solely”, if you look at it the other way round, is really the crux of what I think many of us are concerned about. It is easy to avoid the terms of Article 22 just by slipping in some sort of token human involvement. Defining “meaningful” is so difficult in these circumstances. I am concerned that we are opening the door to something that could be avoided. Even then, the terms of the new clause—we will have a clause stand part debate on Wednesday, obviously—put all the onus on the data subject, whereas that was not the case previously under Article 22. The Minister has not really explained why that change has been made.
I conclude by saying that I very much support Amendment 41. This whole suite of amendments is well drafted. The point about the Equality Act is extremely well made. The noble Lord, Lord Holmes, also has a very good amendment here. It seems to me that involving the ICO right in the middle of this will be absolutely crucial—and we are back to public trust again. If nothing else, I would like explicitly to include that under Clause 14 in relation to Article 22 by the time this Bill goes through.
Grand Committee

My Lords, I hope this is another lightbulb moment, as the noble Lord, Lord Clement-Jones, suggested. As well as Amendment 10, I will speak to Amendments 35, 147 and 148 in my name and the names of the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones. I thank them both. The purpose of these amendments is to move the Bill away from nibbling around the edges of GDPR in pursuit of post-Brexit opportunities and to actually deliver a post-Brexit opportunity.
These amendments would put the UK on an enhanced path of data sophistication while not challenging equivalence, which we will undoubtedly discuss during the Committee. I echo the concern of the noble Lord, Lord Allan, who at Second Reading warned that equivalence is not simply a matter of an arrangement between the Government and the EU but a question that will be picked up by data activists taking strategic litigation to the courts.
Data protection as conceived by the GDPR and in this Bill is primarily seen as an arrangement between an individual and an entity that processes their data—most often a commercial company. But, as the last 20 years have shown, the real power lies in holding either vast swathes of general data, such as that used by LLMs, or large pools of specialist data, such as medical scans. In short, the value—in all forms, not simply financial—lies in big data.
As the value of data became clear, ideas such as “data is the new oil” and data as currency emerged, alongside the notion of data fiduciaries or data trusts, where you can place your data collectively. One early proponent of such ideas was Jaron Lanier, inventor of virtual reality; I remember discussing it with him more than a decade ago. However, these ideas have not found widespread practical application, possibly because they are normally based around ideas of micropayments as the primary value—and very probably because they rely on data subjects gathering their data, so they are for the boffins.
During the passage of the DPA 2018, one noble Lord counted the number of times the Minister said the words “complex” and “complicated” while referring to the Bill. Data law is complex, and the complicated waterfall of its concepts and provisions eludes most non-experts. That is why I propose the four amendments in this group, which would give UK citizens access to data experts for matters that concern them deeply.
Amendment 10 would define the term “data community”, and Amendment 35 would give a data subject the power to assign their data rights to a data community for specific purposes and for a specific time period. Amendment 147 would require the ICO to set out a code of conduct for data communities, including guidance on establishing, operating and joining a data community, as well as guidance for data controllers and data processors on responding to requests made by data communities. Amendment 148 would require the ICO to keep a register of data communities, to make it publicly available and to ensure proper oversight. Together, they would provide a mechanism for non-experts—that is, any UK citizen—to assign their data rights to a community run by representatives that would benefit the entire group.
Data communities diverge from previous attempts to create big data for the benefit of users, in that they are not predicated on financial payments and neither does each data subject need to access their own data via the complex rules and often obstructive interactions with individual companies. They put rights holders together with experts who do it on their behalf, by allowing data subjects to assign their rights so that an expert can gather the data and crunch it.
This concept is based on a piece of work done by a colleague of mine at the University of Oxford, Dr Reuben Binns, an associate professor in human-centred computing, in association with the Worker Info Exchange. Since 2016, individual Uber drivers, with help from their trade unions and the WIE, have asked Uber for the data showing their jobs, earnings, movements, waiting times and so on. It took many months of negotiation, conducted via data protection lawyers, as each driver individually asked for successive pieces of information that Uber, at first, resisted giving them and then, after litigation, provided.
After a period of time, a new cohort of drivers was recruited, and it was only when several hundred drivers were poised to ask the same set of questions that a formal arrangement was made between Uber and WIE, so that they could be treated as a single group and all the data would be provided about all the drivers. This practical decision allowed Dr Binns to look at the data en masse. While an individual driver knew what they earned and where they were, what became visible when looking across several hundred drivers was how the algorithm reacted to those who refused a poorly paid job, who was assigned the lucrative airport runs, whether where you started impacted on your daily earnings, whether those who worked short hours were given less lucrative jobs, and so on.
This research project has continued for several years and benefits from a bespoke arrangement that could, by means of these amendments, be strengthened and made an industry-wide standard with the involvement of the ICO. If it were routine, it would provide opportunities equally for challenger businesses, community groups and research projects. Imagine if a group of elderly people who spend a lot of time at home were able to use a data community to negotiate cheap group insurance, or imagine a research project where I might assign my data rights for the sole purpose of looking at gender inequality. A data community would allow any group of people to assign their rights—rights that are more powerful together than apart. This is doable—I have explained how it has been done. With these amendments, it would be routinely available, contractual, time-limited and subject to a code of conduct.
As it stands, the Bill is regressive for personal data rights and does not deliver the promised Brexit dividends. But there are great possibilities, without threatening adequacy, that could open markets, support innovation in the UK and make data more available to groups in society that rarely benefit from data law. I beg to move.
My Lords, I think this is a lightbulb moment—it is inspired, and this suite of amendments fits together really well. I entirely agree with the noble Baroness, Lady Kidron, that this is a positive aspect. If the Bill contained these four amendments, I might have to alter my opinion of it—how about that for an incentive?
This is an important subject. It is a positive aspect of data rights. We have not got this right yet in this country. We still have great suspicion about sharing and access to personal data. There is almost a conspiracy theory around the use of data, the use of external contractors in the health service and so on, which is extremely unhelpful. If individuals were able to share their data with a trusted hub—a trusted community—that would make all the difference.
Like the noble Baroness, Lady Kidron, I have come across a number of influences over the years. I think the first time many of us came across the idea of data trusts or data institutions was in the Hall-Pesenti review carried out by Dame Wendy Hall and Jérôme Pesenti in 2017. They made a strong recommendation to the Government that they should start thinking about how to operationalise data trusts. Subsequently, organisations such as the Open Data Institute did some valuable research into how data trusts and data institutions could be used in a variety of ways, including in local government. Then the Ada Lovelace Institute did some very good work on the possible legal basis for data trusts and data institutions. Professor Irene Ng was heavily engaged in setting up what was called the “hub of all things”. I was not quite convinced by how it was going to work legally in terms of data sharing and so on, but in a sense we have now got to that point. I give all credit to the academic whom the noble Baroness mentioned. If he has helped us to get to this point, that is helpful. It is not that complicated, but we need full government backing for the ICO and the instruments that the noble Baroness put in her amendments, including regulatory oversight, because it will not be enough simply to have codes that apply. We have to have regulatory oversight.
(10 months, 1 week ago)
Grand Committee
My Lords, I do not actually have much to add to the excellent case that has already been made, but I, too, was at the meeting that the noble Baroness, Lady Jones of Whitchurch, mentioned, and noticed the CMA’s existing relationships.
Quite a lot has been said already, on the first group and just now, about lobbying—not only lobbying in a nasty sense but perhaps the development of relationships that are simply human. I want to make it very clear that those words do not apply to the CMA specifically—but I have worked with many regulators, both here and abroad, and it starts with a feeling that the regulated, not the regulator, holds the information. It goes on to a feeling that the regulated, not the regulator, has the profound understanding of the limits of what is possible. It then progresses to a working relationship in which the regulator, with its limited resources, starts to weigh up what it can win, rather than what it should demand. That results in communities that have actually won legal protections remaining unprotected. It is a sort of triangulation of purpose, in which the regulator’s primary relationship ends up being geared towards government and industry, rather than towards the community that it is constituted to serve.
In that picture, I feel that the amendments in the name of the noble Baroness, Lady Jones of Whitchurch, make it clear, individually and collectively, that at every stage maximum transparency must be observed, and that the incumbents should be prevented from holding all the cards—including by hiding information from the regulator or from other stakeholders who might benefit from it.
I suggest that the amendments do not solve the problem of lobbying or obfuscation, but they incentivise providing information and they give challengers a little bit more of a chance. I am sure we are going to say again and again in Committee that information is power. It is innovation power, political power and market power. I feel passionately that these are technical, housekeeping amendments rather than ones that require any change of government policy.
My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron, whose speech segues straight into my Amendments 14 and 63. This is all about the asymmetry of information. On the one hand, the amendments from the noble Baroness, Lady Jones, which I strongly support and have signed, are about giving information to challengers, whereas my amendments are about extracting information from SMS undertakings.
Failure to respond to a request for information allows SMS players to benefit from the information asymmetry that exists in all technology markets. Frankly, incumbents know much more about how things work than the regulators. They can delay, obfuscate, claim compliance while not fully complying and so on. By contrast, if they cannot proceed unless they have supplied full information, their incentives are changed. They have an incentive to fully inform, if they get a benefit from doing so. That is why merger control works so well and quickly, as the merger is suspended pending provision of full information and competition authority oversight. We saw that with the Activision Blizzard case, where I was extremely supportive of what the CMA did—in many ways, it played a blinder, as was subsequently shown.
We on these Benches consider that a duty to fully inform is needed in the Bill, which is the reason for our Amendments 14 and 63. They insert a new clause in Chapter 2, which provides for a duty to disclose to the CMA
“a relevant digital activity that may give rise to actual or likely detrimental impact on competition in advance of such digital activity’s implementation or effect”
and a related duty in Chapter 6 ensuring that that undertaking
“has an overriding duty to ensure that all information provided to the CMA is full, accurate and complete”.
Under Amendment 14, any SMS undertaking wishing to rely on it would be required both to inform fully and to pre-notify the CMA of any conduct that risks breaching one of the Bill’s objectives in Clause 19. This is similar to the tried-and-tested pre-notification process for mergers and avoids the reality that the SMS player may otherwise simply implement changes and ignore the CMA’s requests. A narrow pre-notification system such as this avoids those risks.
We fully support and have signed the amendments tabled by the noble Baroness, Lady Jones. As techUK says, one of the benefits that wider market participants see from the UK’s pro-competition regime is that the CMA will initiate and design remedies based on the evidence it gathers from SMS firms in the wider market. This is one of the main advantages of the UK’s pro-competition regime over the EU DMA. To achieve this, we need to make consultation rights equal for all parties. Under the Bill currently, firms with SMS status, as the noble Baroness, Lady Harding, said, will have far greater consultation rights than those that are detrimentally affected by their anti-competitive behaviour. As she and the noble Lord, Lord Vaizey, said, there are opportunities for SMS firms to comment at the outset but none for challenger firms, which can comment only at a later public consultation stage.
It is very important that there are clear consultation and evidence-gathering requirements for the CMA, which must ensure that it works fairly with SMS firms, challengers, smaller firms and consumers throughout the process. The design of conduct requirements applying to SMS firms, and of pro-competition interventions, should consider evidence from all sides, allowing interventions to be targeted and capable of delivering effective outcomes. This kind of engagement will be vital to ensuring that the regime can meet its objectives.
We do not believe that addressing this risk requires removing the flexibility given by the Bill. Instead, we believe that it is essential that third parties are given a high degree of transparency and input on deliberation between the CMA and SMS firms. The CMA must also—and I think this touches on something referred to by the noble Baroness, Lady Jones—allow evidence to be submitted in confidence, as well as engage in wider public consultations where appropriate. We very strongly support the amendments.
On the amendments from the noble Lord, Lord Tyrie, it is a bit of a curate’s egg. I support Amendments 12A and 12B because I can see the sense in them. I do not see that we need another way of marking the CMA’s homework, however. I am a great believer in greater oversight, and we have amendments later in the Bill proposing increased parliamentary oversight of what the CMA is doing. But marking the CMA’s homework at that stage is only going to be an impediment. It will be for the benefit of the SMS undertakings and not necessarily for those who wish to challenge their power. I am only 50% with the noble Lord, rather than going the whole hog.
My Lords, all the SMS has to do is put it through one of its large language models, and hey presto.
I am losing track of the conversation because I thought we were asking for more information for the challenger companies, rather than this debate between the SMS and the regulator. Both of them are, I hope, well resourced, but the challenger companies have somehow been left out of this equation and I feel that we are trying to get them into the equation in an appropriate way.
That is not incompatible. These are two sides of the same coin, which is why they are in this group. I suppose we could have degrouped it.
(1 year, 4 months ago)
Lords Chamber
The benefit of having a period of time between the last day of Report on Wednesday and Third Reading is that that gives the Minister, the Bill team and parliamentary counsel the time to reflect on the kind of power that could be devised. The wording could be devised, and I would have thought that six weeks would be quite adequate for that, perhaps in a general way. After all, this is not a power that is immediately going to be used; it is a general power that could be brought into effect by regulation. Surely it is not beyond the wit to devise something suitable.
My Lords, I will speak to the government Amendments 274B and 274C. I truly welcome a more detailed approach to Ofcom’s duties in relation to media literacy. However, as is my theme today, I raise two frustrations. First, having spent weeks telling us that it is impossible to include harms that go beyond content, and having opposed amendments on that point, the Government have included in their media literacy amendments a duty to help users to understand the harmful ways in which regulated services may be used, in addition to understanding the nature and impact of harmful content. This appears to suggest that it is the users who are guilty of misuse of products and services, rather than putting any emphasis on the design or processes that determine how a service is most often used.
I believe that all of us, including children, are participants in creating an online culture and that educating and empowering users of services is essential. However, it should not be a substitute for designing a service that is safe by design and default. To make my point absolutely clear, I recount the findings of researchers who undertook workshops in 28 countries with more than 1,000 children. The researchers were at first surprised to find that, whether in Kigali, São Paulo or Berlin, to an overwhelming extent children identified the same problems online—harmful content, addiction, lack of privacy and so on. The children’s circumstances were so vastly different—country and town, Africa and the global north et cetera—but when the researchers did further analysis, they realised that the reason the children had such similar experiences was that they were using the same products. The products were more determinative of the outcome than anything to do with religion, education, status, age, the family or even the country. The only other factor that loomed large, which I admit that the Government have recognised, was gender. Those were the two most crucial findings. It is an abdication of adult responsibility to place the onus on children to keep themselves safe. The amendment and the Bill, as I keep mentioning, should focus on the role of design, not on how a child uses a service.
My second point, which is of a similar nature, is that I am very concerned that a lot of digital literacy—for adults as well as children, but my particular concern is in schools—is provided by the tech companies themselves. Therefore, once again, their responsibility—their role in the systems and processes, the reward loops, algorithms and so on, that determine what children might find—is very low down on the agenda. Is it possible at this late stage to consider that Ofcom might have a responsibility to consider the system design as part of its literacy review?
My Lords, this has been a very interesting short debate. Like other noble Lords, I am very pleased that the Government have proposed the new clauses in Amendments 274B and 274C. The noble Baroness, Lady Bull, described absolutely the importance of media literacy, particularly for disabled people and for the vulnerable; it is really important for them. It is important also not to fall into the trap described by the noble Baroness, Lady Kidron, of saying, “You are a child or a vulnerable person. You must acquire media literacy—it’s your obligation; it’s not the obligation of the platforms to design their services appropriately”. I take that point, but it does not mean that media literacy is not extraordinarily important.
However, sadly, I do not believe that the breadth of the Government’s new media literacy amendments is as wide as that of the original draft Bill. If you look back at the draft Bill, it contained a completely new and upgraded set of duties right across the board, replacing Section 11 of the Communications Act and, in a sense, fit for the modern age. The Government have instead created a much narrower media literacy duty, which relates only to regulated services. This is not optimum. We need something that places a bigger and broader duty for the future on Ofcom.
It is also deficient in two respects. The noble Lord, Lord Knight, will speak to his amendments, but it struck me immediately when looking at that proposed new clause that we were missing all the debate about functionalities and design that the noble Baroness, Lady Kidron, raised the other day. We must ensure that media literacy encompasses understanding the underlying functionalities and systems of the platforms that we are talking about.
I know that your Lordships will be very excited to hear that I am going to refer again to the Joint Committee. I know that the Minister has read us from cover to cover, but in paragraph 381 of our report on the draft Bill we said, and it is still evergreen:
“If the Government wishes to improve the UK’s media literacy to reduce online harms, there must be provisions in the Bill to ensure media literacy initiatives are of a high standard. The Bill should empower Ofcom to set minimum standards for media literacy initiatives that both guide providers and ensure the information they are disseminating aligns with the goal of reducing online harm”.
I had a very close look at the clause. I could not see that Ofcom is entitled to set minimum standards. The media literacy provisions sadly are deficient in that respect.
My Lords, I support Amendment 228. I spoke on this issue to the longer amendment in Committee. To decide whether something is illegal without the entire apparatus of the justice system, in which a great deal of care is taken to decide whether something is illegal, at high volume and high speed, is very worrying. It strikes me as amusing, because someone commented earlier that they like a “must” instead of a “maybe”. In this case, where the provision says that a provider must treat the content as content of the kind in question accordingly, I caution that something a little softer is needed—not a cliff edge that ends up in horrors around illegality, where someone who has acted in self-defence is accused of a crime of violence, as happens to many women, and so on and so forth. I do not want to labour the point. I just urge a gentle landing rather than, as it is written, a cliff edge.
My Lords, this has been a very interesting debate. Beyond peradventure, my noble friend Lord Allan, the noble Viscount, Lord Colville, and the noble Baroness, Lady Fox, have powerfully demonstrated the perils of this clause. “Lawyers’ caution” is one of my noble friend’s messages to take away, as are the complexities in making these judgments. It was interesting when he mentioned the sharing of certain forms of content for awareness’s sake and the judgments that must be taken by platforms. His phrase “If in doubt, take it out” is pretty chilling in free speech terms—I think that will come back to haunt us. As the noble Baroness, Lady Fox, said, the wrong message is being delivered by this clause. It is important to have some element of discretion here and not, as the noble Baroness, Lady Kidron, said, a cliff edge. We need a gentler landing. I very much hope that the Minister will land more gently.
(1 year, 4 months ago)
Lords Chamber
My Lords, I too express my admiration to the noble Baroness, Lady Stowell, for her work on this group with the Minister and support the amendments in her name. To pick up on what the noble Baroness, Lady Harding, said about infinite ping-pong, it can be used not only to avoid making a decision but as a form of power and of default decision-making—if you cannot get the information back, you are where you are. That is a particularly important point and I add my voice to those who have supported it.
I have a slight concern that I want to raise in public, so that I have said it once, and get some reassurance from the Minister. New subsection (B1)(d) in Amendment 134 concerns the Secretary of State directing Ofcom to change codes that may affect
“relations with the government of a country outside the United Kingdom”.
Many of the companies that will be regulated sit in America, which has been very forceful about protecting its sector. Without expanding on this too much, when it was suggested that senior managers would face some sort of liability in international fora, various parts of the American Government and state apparatus certainly made their feelings clearly known.
I am sure that the channels between our Government and the US are much more straightforward than any that I have witnessed, but it is absolutely definite that more than one Member of your Lordships’ House was approached about senior management liability and told, “This is a worry to us”. I believe that where we have landed is very good, but I would like the Minister to say what the limits of that power are, and to acknowledge that it could get into a bit of a muddle between government relations and the economic outcomes that we were talking about—having celebrated that they had been taken off the list. That was the thing that slightly worried me in the government amendments, which, in all other ways, I welcome.
My Lords, this has been a consistent theme ever since the Joint Committee’s report. It was reported on by the Delegated Powers and Regulatory Reform Committee, and the Digital and Communications Committee, chaired by the noble Baroness, Lady Stowell, has rightly taken up the issue. Seeing some movement from the Minister, particularly on Clause 29 and specifically in terms of Amendments 134 to 137, is very welcome and consistent with some of the concerns that have been raised by noble Lords.
There are still questions to answer about Amendment 138, which my noble friend has raised. I have also signed the amendments to Clause 38 because I think the timetabling is extremely welcome. However, like other noble Lords, I believe we need to have Amendments 139, 140, 144 and 145 in place, as proposed by the noble Baroness, Lady Stowell of Beeston. The phrase “infinite ping-pong” makes us all sink in gloom, in current circumstances—it is a very powerful phrase. I think the Minister really does have to come back with something better; I hope he will give us that assurance, and that his discussions with the noble Baroness, Lady Stowell, will bear further fruit.
I may not agree with the noble Lord, Lord Moylan, about the Clause 39 issues, but I am glad he raised issues relating to Clause 159. It is notable that, of all the recommendations by the Delegated Powers and Regulatory Reform Committee, the Government accepted four out of five but did not accept the one related to what is now Clause 159. I have deliberately degrouped the questions of whether Clauses 158 and 159 should stand part of the Bill, so I am going to pose a few questions now in the hope that, when we get to the second group, which contains my clause stand part proposition, the Minister will be able to tell me effortlessly what he is going to do. This will prevent me from putting down further amendments on those clauses, because it seems to me that the Government are being extraordinarily inconsistent in how they are dealing with Clauses 158 and 159 compared with how they have amended Clause 39.
For instance, Clause 158 allows the Secretary of State, where there are reasonable grounds for believing that there is a threat to public health and safety or to national security, to direct Ofcom to set objectives for how it uses its media literacy powers under Section 11 of the Communications Act for a specific period to address the threat, and to make Ofcom issue a public statement notice. That is rather extraordinary. I will not go into great detail at this stage, and I hope the Minister can save me from having to make a long speech further down the track, but the Government should not be in a position to direct a media regulator on a matter of content. For instance, the Secretary of State has no powers over Ofcom on the content of broadcast regulation—indeed, they have limited powers of direction over radio spectrum and wires—and there is no provision for parliamentary involvement, although I accept that the Secretary of State must publish reasons for the direction. There is also the general question of whether the threshold is high enough to justify this kind of interference. So Clause 158 is not good news at all. It raises a number of questions which I hope the Minister will start to answer today, and maybe we can avoid a great debate further down the track.
My Lords, I strongly support Amendment 180, tabled by the noble Baroness, Lady Merron. I will also explain why I put forward Amendment 180A. I pay tribute to the noble Baroness, Lady Hayman, who pursued this issue with considerable force through her Question in the House.
There is clearly an omission in the Bill. One of its primary aims is to protect children from harmful online content, and animal cruelty content causes harm to the animals involved and, critically, to the people who view it, especially children. In Committee, in the Question and today, we have referred to the polling commissioned by the RSPCA, which found that 23% of 10 to 18 year-olds had seen animal cruelty on social media sites. I am sure that the numbers have increased since that survey in 2018. A study published in 2017 found—if evidence were needed—that:
“There is emerging evidence that childhood exposure to maltreatment of companion animals is associated with psychopathology in childhood and adulthood.”
The noble Baroness made an extremely good case, and I do not think that I need to add to it. When the Bill went through the Commons, assurances were given by the former Minister, Damian Collins, who acknowledged that the inclusion of animal cruelty content in the Bill deserves further consideration as the Bill progresses through its parliamentary stages. We need to keep up that pressure, and we will be very much supporting the noble Baroness if she asks for the opinion of the House.
Turning to my Amendment 180A, like the noble Baroness, I pay tribute to the Social Media Animal Cruelty Coalition, which is a very large coalition of organisations. We face a global extinction crisis which the UK Government themselves have pledged to reverse. Algorithmic amplification tools and social media recommendation engines have driven an explosive growth in online wildlife trafficking. A National Geographic article from 2020 quoted US wildlife officials describing the dizzying scale of the wildlife trade on social media. The UK’s National Wildlife Crime Unit says that cyber-enabled wildlife crime has become its priority focus, since virtually all the wildlife cases it now investigates have a cyber component to them, usually involving social media or e-commerce platforms. In a few clicks it is easy to find pages, groups and postings selling wildlife products made from endangered species, such as elephant ivory, rhino horn, pangolin scales and marine turtle shells, as well as big cats, reptiles, birds, primates and insects for the exotic pet trade. This vast, unregulated trade in live animals and their parts is not only illegal but exacerbates the risk of another animal/human spillover event such as the ones that caused Ebola, HIV and the Covid-19 pandemic.
In addition to accepting the animal welfare amendment tabled by the noble Baroness, which I hope they do, the Government should also add offences under the Control of Trade in Endangered Species Regulations 2018 to Schedule 7 to the Bill. This would definitely help limit the role of social media platforms in enabling wildlife trafficking, helping to uphold the UK’s commitments to tackling global wildlife crime.
My Lords, I rise very briefly to support the noble Baroness, Lady Merron, and to make only one point. As someone who has the misfortune of seeing a great deal of upsetting material of all kinds, I have to admit that it sears an image on your mind. I have had the misfortune to see the interaction of animal and human cruelty in the same sequences, again and again. In making the point that there is a harm to humans in witnessing and normalising this kind of material, I offer my support to the noble Baroness.
My Lords, Amendments 180 and 180A seek to require the Secretary of State to conduct a review of existing legislation and how it relates to certain animal welfare offences and, contingent on this review, to make them priority offences under the regulatory framework.
I am grateful for this debate on the important issue of protecting against animal cruelty online, and all of us in this House share the view that it is important to do so. As the House has discussed previously, this Government are committed to strong animal welfare standards and protections. In this spirit, this Government recognise the psychological harm that animal cruelty content can cause to children online. That is why we tabled an amendment that lists content that depicts real or realistic serious violence or injury against an animal, including by fictional creatures, as priority content that is harmful to children. This was debated on the first day of Report.
In addition, all services will need proactively to tackle illegal animal cruelty content where this amounts to an existing offence such as extreme pornography. User-to-user services will be required swiftly to remove other illegal content that targets an individual victim once made aware of its presence.
The noble Baroness asked about timing. We feel it is important to understand how harm to animals as already captured in the Bill will function before committing to the specific remedy proposed in the amendments.
As discussed in Committee, the Bill’s focus is rightly on ensuring that humans, in particular children, are protected online, which is why we have not listed animal offences in Schedule 7. As many have observed, this Bill cannot fix every problem associated with the internet. While we recognise the psychological harm that can be caused to adults by seeing this type of content, listing animal offences in Schedule 7 is likely to dilute providers’ resources away from protecting humans online, which is the Bill’s main purpose.
However, I understand the importance of taking action on animal mistreatment when committed online, and I am sympathetic to the intention of these amendments. As discussed with the noble Baroness, Defra is confident that the Animal Welfare Act 2006 and its devolved equivalents can successfully be used to bring prosecutions for acts of animal torture committed online in the UK. These Acts do not cover acts of cruelty that take place outside the UK. I know from the discussion we have had in this House that there are real concerns that the Animal Welfare Act 2006 cannot tackle cross-border content, so I wish to make a further commitment today.
The Government have already committed to consider further how the criminal law can best protect individuals from harmful communications, alongside other communications offences, as part of changes made in the other place. To that end, we commit to include the harm caused by animal mistreatment communications as part of this assessment. This will then provide a basis for the Secretary of State to consider whether this offence should be added to Schedule 7 to the OSB via the powers in Clause 198. This work will commence shortly, and I am confident that this, in combination with animal cruelty content listed as priority harms to children, will safeguard users from this type of content online.
For the reasons set out, I hope the noble Baroness and the noble Lord will consider not pressing their amendments.
My Lords, I rise to make a slightly lesser point, but I also welcome these amendments. I want to ask the Minister where the consultation piece of this will lie and to check that all the people who have been in this space for many years will be consulted.
My Lords, as ever, my noble friend Lord Allan and the noble Baroness, Lady Kidron, have made helpful, practical and operational points that I hope the Minister will be able to answer. In fact, the first half of my noble friend’s speech was really a speech that the Minister himself could have given in welcoming the amendment, which we do on these Benches.
(1 year, 4 months ago)
Lords Chamber
My Lords, I have to admit that it was incompetence rather than lack of will that meant I did not add my name to Amendment 39 in the name of the noble Lord, Lord Bethell, and I would very much like the Government to accept his argument.
In the meantime, I wonder whether the Minister would be prepared to make it utterly clear that proportionality does not mean a little bit of porn to a large group of children or a lot of porn to a small group of children; rather, it means that high-risk situations require effective measures and low-risk situations should be proportionate to that. On that theme, I say to the noble Lord, Lord Allan, whose points I broadly agree with, that while we would all wish to see companies brought into the fold rather than being out of the fold, it rather depends on their risk.
This brings me neatly to Amendments 43 and 87 from the noble Lord, Lord Russell, to which I managed to add my name. They make a very similar point to Amendment 39 but across safety duties. Amendment 242 in my name, to which the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names, makes the same point—yet again—in relation to Ofcom’s powers.
All these things are pointing in the same direction as Amendment 245 in the name of the noble Baroness, Lady Morgan, which I keep on trumpeting from these Benches and which offers an elegant solution. I urge the Minister to consider Amendment 245 before day four of Report because, if the Government were to accept it, it would focus company resources, focus Ofcom resources and, as we discussed on the first day of Report, permit companies which do not fit the risk profile of the regime—those unable to comply with requirements that do not fit their model yet leave them vulnerable to enforcement—to be treated in an appropriate way.
Collectively, the ambition is to make sure that we are treating things in proportion to the risk and that proportionate does not start meaning something else.
My Lords, I agree with the noble Baroness, Lady Kidron, that all these amendments are very much heading in the same direction, and from these Benches I am extremely sympathetic to all of them. It may well be that this is very strongly linked to the categorisation debate, as the noble Baroness, Lady Kidron, said.
The amendment from the noble Lord, Lord Bethell, matters even more when we are talking about pornography in the sense that child safety duties are based on risks. I cannot for the life of me see why we should try to contradict that by adding in capacity and size and so on.
My noble friend made a characteristically thoughtful speech about the need for Ofcom to regulate in the right way and make decisions about risk and the capacity challenges of new entrants and so on. I was very taken by what the noble Baroness, Lady Harding, had to say. This is akin to health and safety and, quite frankly, it is a cultural issue for developers. What after all is safety by design if it is not advance risk assessment of the kinds of algorithm that you are developing for your platform? It is a really important factor.
My Lords, I rise briefly to note that, in the exchange between the noble Lords, Lord Allan and Lord Moylan, the question arose of where you can complain. An independent complaints mechanism would be as advantageous to people who are concerned about freedom of speech as to anyone else. I join and add my voice to other noble Lords who expressed their support for the noble Baroness, Lady Fox, on Amendment 162 about the Public Order Act.
My Lords, we are dangerously on the same page this evening. I absolutely agree with the noble Baroness, Lady Kidron, about demonstrating the need for an independent complaints mechanism. The noble Baroness, Lady Stowell, captured quite a lot of the need to keep the freedom of expression aspect under close review, as we go through the Bill. The noble Baroness, Lady Fox, and the noble Lord, Lord Moylan, have raised an important and useful debate, and there are some crucial issues here. My noble friend captured it when he talked about the justifiable limitations and the context in which limitations are made. Some of the points made about the Public Order Act offences are extremely valuable.
I turn to one thing that surprised me. It was interesting that the noble Lord, Lord Moylan, quoted the Equality and Human Rights Commission, which said it had reservations about the protection of freedom of expression in the Bill. As we go through the Bill, it is easy to keep our eyes on the ground and not to look too closely at the overall impact. In its briefing, which is pretty comprehensive, paragraph 2.14 says:
“In a few cases, it may be clear that the content breaches the law. However, in most cases decisions about illegality will be complex and far from clear. Guidance from Ofcom could never sufficiently capture the full range or complexity of these offences to support service providers comprehensively in such judgements, which are quasi-judicial”.
I am rather more optimistic than that, but we need further assurance on how that will operate. Life would probably be easier for all concerned if we did not have the Public Order Act offences in Schedule 7.
I am interested to hear what the Minister says. I am sure that there are pressures on him, from his own Benches, to look again at these issues to see whether more can be done. The EHRC says:
“Our recommendation is to create a duty to protect freedom of expression to provide an effective counterbalance to the duties”.
The noble Lord, Lord Moylan, cited this. Freedom of expression is referenced a great deal in the Bill, but not in Ofcom’s duties. So this could be a late contender to settle the horses, so to speak.
This is a difficult Bill; we all know that so much nuance is involved. We really hope that there is not too much difficulty in interpretation when it is put into practice through the codes. That kind of clarity is what we are trying to achieve, and, if the Minister can help to deliver that, he will deserve a monument.
(1 year, 4 months ago)
Lords Chamber
I also want to support the noble Baroness, Lady Kennedy. The level of abuse directed at women online, and its gendered nature, have been minimised; the perpetrators have clearly felt immune to the consequences of law enforcement. What worries me a little in this discussion is the idea or conflation that anything said to a woman is an act of violence. I believe that the noble Baroness was being very specific about the sorts of language that could be caught under her suggestions. I understand from what she said that she has been having conversations with the Minister. I very much hope that something is done in this area, and that it is explored more fully, as the noble Baroness, Lady Morgan, said, in the guidance. However, I just want to make the point that online abuse is also gamified: people make arrangements to abuse people in groups in particular ways that are not direct. If they threaten violence, that is quite different to a pile-in saying that you are a marvellous human being.
My Lords, I too must declare my interests on the register—I think that is the quickest way of doing it to save time. We still have time, and I very much hope that the Minister will listen to this debate and consider it. Although we are considering clauses that, by and large, come at the end of the Bill, there is still time procedurally—if the Minister so decides—to come forward with an amendment later on Report or at Third Reading.
We have heard some very convincing arguments today. My noble friend explained that the Minister did not like the DPP solution. I have looked back again at the Law Commission report, and I cannot for the life of me see the distinction between what was proposed for the offence in its report and what is proposed by the Government. There is barely a cigarette paper, if we are still allowed to use that analogy, between them, but the DPP is recommended—perhaps not the DPP personally, although I do not know quite what distinction is made there by the Law Commission—and the Minister clearly did not like that. My noble friend has come back with some specifics, and I very much hope that the Minister will put on the record that, in those circumstances, there would not be a prosecution. As we heard in Committee, 130 different organisations had strong concerns, and I hope that the Minister will respond to those concerns.
As regards my other noble friend’s amendment, she has again creatively come back with a proposal for including reckless behaviour. The big problem here is that many people believe that, unless you include “reckless” or “consent”, the “for a laugh” defence operates. As the Minister knows, expert advice has been received on this subject. I hope the Minister continues his discussions. I very much support my noble friend in this respect. I hope he will respond to her in respect of timing and monitoring—the noble Baroness, Lady Morgan, mentioned the need for the issue to be kept under review—even if at the end of the day he does not respond positively with an amendment.
Everybody believes that we need a change of culture—even the noble Baroness, Lady Fox, clearly recognises that—but the big difference is whether or not we believe that these particular amendments should be made. We very much welcome what the Law Commission proposed and what the Government have put into effect, but the question at the end of the day is whether we truly are making illegal online what is illegal offline. That has always been the Government’s test. We must be mindful of that in trying to equate online behaviour with offline behaviour. I do not believe that we are there yet, however much moral leadership we are exhorted to display. I very much take the point of the noble Baroness, Lady Morgan, about the violence against women and girls amendment that the Government are coming forward with. I hope that will have a cultural change impact as well.
As regards the amendments of the noble Baroness, Lady Kennedy, I very much take the point she made, both at Committee and on Report. She was very specific, as the noble Baroness, Lady Kidron, said, and was very clear about the impact, which as men we severely underestimate if we do not listen to what she said. I was slightly surprised that the noble Baroness, Lady Fox, really underestimates the impact of that kind of abuse—particularly that kind of indirect abuse.
I was interested in what the Minister had to say in Committee:
“In relation to the noble Baroness’s Amendment 268, the intentional encouragement or assistance of a criminal offence is already captured under Sections 44 to 46 of the Serious Crime Act 2007”.—[Official Report, 22/6/23; col. 424.]
Is that still the Government’s position? Has that been explained to the noble Baroness, Lady Kennedy, who I would have thought was pretty expert in the 2007 Act? If she does not agree with the Minister, that is a matter of some concern.
Finally, I agree that we need to consider the points raised at the outset by the noble and learned Lord, Lord Garnier, and I very much hope that the Government will keep that under review.
(1 year, 5 months ago)
Lords Chamber
My Lords, I congratulate the noble Baroness on having elucidated this arcane set of amendments. Unfortunately, though, it makes me deeply suspicious when I see what the amendments seem to do. I am not entirely clear about whether we are returning to some kind of merits-based appeal. If so, since the main litigators are going to be the social media companies, it will operate for their benefit to reopen every single thing that they possibly can on the basis of the original evidence that was taken into account by Ofcom, as opposed to doing it on a JR basis. It makes me feel quite uncomfortable if it is for their benefit, because I suspect it is not going to be for the ordinary user who has been disadvantaged by a social media company. I hope our brand spanking new independent complaints system—which the Minister will no doubt assure us is well on the way—will deal with that, but this strikes me as going a little too far.
My Lords, I enter the fray with some trepidation. In a briefing, Carnegie, which we all love and respect, and which has been fantastic in the background in Committee days, shared some concerns. As I interpret its concerns, when Ofcom was created in 2003 its decisions could be appealed on their merits, as the noble Lord has just suggested, to the Competition Appeal Tribunal, and I believe that this was seen as a balancing measure against an untested regime. What followed was that the broad basis on which appeal was allowed led to Ofcom defending 10 appeals per year, which really frustrated its ability as a regulator to take timely decisions. It turned out that the appeals against Ofcom made up more than 80% of the workload of the Competition Appeal Tribunal, whose work was supposed to cover a whole gamut of matters. When there was a consultation on the fringes of the DEA, it was decided to restrict appeal to judicial review and appeal on process. I just want to make sure that we are not opening up a huge and unnecessary delaying tactic.
My Lords, we already had a long debate on this subject earlier in Committee. In the interim, many noble Lords associated with these amendments have had conversations with the Government, which I hope will bear some fruit before Report. Today, I want to reiterate a few points that I hope are clarifying to the Committee and the department. In the interests of everyone’s evening plans, the noble Lord, Lord Bethell, and the noble Baroness, Lady Harding, wish to associate themselves with these remarks so that they represent us in our entirety.
For many years, we thought age verification was a gold standard, primarily because it involved a specific government-issued piece of information such as a passport. By the same token, we thought age estimation was a lesser beast, given that it is an estimate by its very nature and that the sector primarily relied on self-declarations with very few checks and balances. In recent years, many approaches to age checking have flourished. Some companies provide age assurance tokens based on facial recognition; others use multiple signals of behaviour, friendship group, parental controls and how you move your body in gameplay; and, only yesterday, I saw the very impressive on-device privacy-preserving age-verification system that Apple rolled out in the US two weeks ago. All of these approaches, used individually and cumulatively, have a place in the age-checking ecosystem, and all will become more seamless over time. But we must ensure that, when they are used, they are adequate for the task they are performing and are quality controlled so that they do not share information about a child, are secure and are effective.
That is why, at the heart of the package of measures put forward in my name and that of the noble Lords, Lord Stevenson and Lord Bethell, and the right reverend Prelate the Bishop of Oxford, are two concepts. First, the method of measuring age should be tech neutral so that all roads can be used. Secondly, there must be a robust mechanism for measuring effectiveness, so that only effective systems can be used in high-risk situations, particularly those involving primary priority harms such as self-harm and pornography, and so that such measurement is determined by Ofcom, not industry.
From my work over the last decade and from recent discussions with industry, I am certain that any regime of age assurance must be measurable and hold to certain principles. We cannot create a situation where children’s data is loosely held and liberally shared; we cannot have a system that discriminates against, or does not have automatic appeal mechanisms for, children of colour or those aged 17 or 19, for whom errors are most likely. Systems should aim to be interoperable and private, and should not leave traces as children go from one service to another.
Each of the principles of our age-verification package set out in the schedule is of crucial importance. I hope that the Government will see the sense in them because, without them, this age checking will not be trusted. Equally, I urge the Committee to embrace the duality of age verification and estimation that the Government have put forward because, if a child uses an older sibling’s form of verification and a company understands through the child’s behaviour that they are indeed a child, we do not want to set up a perverse situation in which the verification is considered of a higher order and the company cannot take action based on estimation; ditto, if estimation in gameplay is more accurate than tokens that verify whether someone is over or under 18, it may well be that estimation gives greater assurance that the company will treat the child according to their age.
I hope and believe that, in his response, the Minister will confirm that definitions of age assurance and age estimation will be on the face of the Bill. I also urge him to make a generous promise to accept the full gamut of our concerns about age checking and bring forward amendments in his name on Report that reflect them in full. I beg to move.
My Lords, I associate these Benches with the introduction by the noble Baroness, Lady Kidron, support her amendments and, likewise, hope that they form part of the package that is trundling on its way towards us.
(1 year, 5 months ago)
Lords Chamber
My Lords, I will be even more direct than the noble Baroness, Lady Morgan, and seek some confirmation. I understood from our various briefings in Committee that, where content is illegal, it is illegal anywhere in the digital world—it is not restricted simply to user to user, search and Part 5. Can the Minister say whether I have understood that correctly? If I have, will he confirm that Ofcom will be able to use its disruption powers on a service out of scope, as it were, such as a blog or a game with no user-to-user aspect, if it were found to be persistently hosting illegal content?
My Lords, this has been an interesting debate, though one of two halves, if not three.
The noble Lord, Lord Bethell, introduced his amendment in a very measured way. My noble friend Lady Benjamin really regrets that she cannot be here, but she strongly supports it. I will quote her rather than deliver her speech in its entirety, as we have been admonished for that previously. She would have said that
“credit card companies have claimed ignorance using the excuse of how could they be expected to know they are supporting porn if they were not responsible for maintaining porn websites … This is simply not acceptable”.
Noble Lords must forgive me—I could not possibly have delivered that in the way that my noble friend would have done. However, I very much took on board what the noble Lord said about how this makes breaches transparent to the credit card companies. It is a right to be informed, not an enforcement power. The noble Lord described it as a simple and proportionate measure, which I think is fair. I would very much like to hear from the Minister why, given the importance of credit card companies in the provision of pornographic content, this is not acceptable to the Government.
The second part of this group is all about effective enforcement, which the noble Lord, Lord Bethell, spoke to as well. This is quite technical; it is really important that these issues have been raised, in particular by the noble Lord. The question is whether Ofcom has the appropriate enforcement powers. I was very taken by the phrase
“pre-empt a possible legal challenge”,
as it is quite helpful to get your retaliation in first. Underlying all this is the need to know what advice the Minister and Ofcom are getting about the enforcement powers and so on.
I am slightly more sceptical about the amendments from the noble Lord, Lord Curry. I am all in favour of the need for speed in enforcement, particularly having argued for it in competition cases, where getting ex-ante powers is always a good idea—the faster one can move, the better. However, restricting the discretion of Ofcom in those circumstances seems to me a bit over the top. Many of us have expressed our confidence in Ofcom as we have gone through the Bill. We may come back to this in future; none of us thinks the Bill will necessarily be the perfect instrument, and it may prove that we do not have a sufficiently muscular regulator. I entirely respect the noble Lord’s track record and experience in regulation, but Ofcom has so far given us confidence that it will be a muscular regulator.
I turn now to the third part of the group. I was interested in the context in which my noble friend placed enforcement; it is really important, and it was supported by the noble Baroness, Lady Morgan. It is interesting what questions have been asked about the full extent of the Government’s ambitions in this respect: are VPNs going to be subject to these kinds of notices? I would hope so; if VPNs are really the gateway to some of the unacceptable harms that we are trying to prevent, we should know about that. We should be very cognisant of the kind of possible culture being adopted by some of the social media and regulated services, and we should tailor our response accordingly. I will be interested to hear what the Government have to say on that.
(1 year, 6 months ago)
Lords Chamber
My Lords, I shall say very briefly in support of these amendments that in 2017, the 5Rights Foundation, of which I am the chair, published the Digital Childhood report, which in a way was the thing that put the organisation on the map. The report looked at the evolving capacity of children through childhood, what technology they were using, what happened to them and what the impact was. We are about to release the report again, in an updated version, and one of the things that is most striking is the introduction of fraud into children’s lives. At the point at which they are evolving into autonomous people, when they want to buy presents for their friends and parents on their own, they are experiencing what the noble Baroness, Lady Morgan, expressed as embarrassment, loss of trust and a sense of deserting confidence—I think that is probably the phrase. So I just want to put on the record that this is a problem for children also.
My Lords, this has been an interesting short debate and the noble Baroness, Lady Morgan, made a very simple proposition. I am very grateful to her for introducing this so clearly and comprehensively. Of course, it is all about the way that platforms will identify illegal, fraudulent advertising and attempt to align it with other user-to-user content in terms of transparency, reporting, user reporting and user complaints. It is a very straightforward proposition.
First of all, however, we should thank the Government for acceding to what the Joint Committee suggested, which was that fraudulent advertising should be brought within the scope of the Bill. But, as ever, we want more. That is what it is all about and it is a very straightforward proposition which I very much hope the Minister will accede to.
We have heard from around the Committee about the growing problem and I will be very interested to read the report that the noble Baroness, Lady Kidron, was talking about, in terms of the introduction of fraud into children’s lives—that is really important. The noble Baroness, Lady Morgan, mentioned some of the statistics from Clean Up the Internet, Action Fraud and so on, as did the noble Viscount, Lord Colville. And, of course, it is now digital. Some 80% of fraud, as he said, is cyber-enabled, and 23% of all reported frauds are initiated on social media—so this is bang in the area of the Bill.
It has been very interesting to see how some of the trade organisations, the ABI and others, have talked about the impact of fraud, including digital fraud. The ABI said:
“Consumers’ confidence is being eroded by the ongoing proliferation of online financial scams, including those predicated on impersonation of financial service providers and facilitated through online advertising. Both the insurance and long-term savings sectors are impacted by financial scams perpetrated via online paid-for advertisements, which can deprive vulnerable consumers of their life savings and leave deep emotional scars”.
So, this is very much a cross-industry concern and very visible to the insurance industry and no doubt to other sectors as well.
I congratulate the noble Baroness, Lady Morgan, on her chairing of the fraud committee and on the way it came to its conclusions and scrutinised the Bill. Paragraphs 559, 560 and 561 all set out where the Bill needs to be aligned with the other content that it covers. As she described, there are two areas where the Bill can be improved. If these gaps are not cured, they will substantially undermine its ability to tackle online fraud effectively.
This has the backing of Which?, so, as the Minister will note, it is very much a cross-industry and consumer-body set of amendments, supporting transparency reporting and making sure that those platforms with more fraudulent advertising make proportionately larger changes to their systems. There is already transparency reporting for all the illegal harms that platforms are obliged to prevent; there is no reason why advertising should be exempt. On user reporting and complaints, it is currently unclear whether the duty applies only to illegal user-generated content and unpaid search content or whether it also applies to illegal fraudulent advertisements. At the very least, I hope the Minister will clarify that today.
Elsewhere, the Bill requires platforms to allow users to complain if the platform fails to comply with its duties to protect users from illegal content and with regard to the content-reporting process. I very much hope the Minister will accede to including that as well.
Some very simple requests are being made in this group. I very much hope that the Minister will take them on board.
(1 year, 6 months ago)
Lords Chamber
I entirely understand what the noble Baroness is saying, and I know that she feels particularly strongly about these issues given her experiences. The whole Bill is about trying to weigh up different aspects—we are on day 5 now, and balance has very much been the tenor of our discussions.
I want to reassure the noble Baroness that we did discuss anonymity in relation to the issues that she has put forward. A company should not be able to use anonymity as an excuse not to deal with the situation, and that is slightly different from simply saying, “We throw our hands up on those issues”.
There is a difference between companies using anonymity to say, “We don’t know who it is, and therefore we can’t deal with it”, and the idea that they should take action against people who are abusing the system and the terms of service. It is a subtle distinction, but it is very meaningful in relation to what the noble Baroness is suggesting.
That is a very fair description. We have tried to emphasise throughout the discussion on the Bill that it is about not just content but how the system and algorithms work in terms of amplification. On page 35 of our report, we try to address some of those issues—it is not central to the point about anonymity, but we certainly talked about the way that messages are driven by the algorithm. Obviously, how that operates in practice and how the Bill as drafted operates is what we are kicking the tyres on at the moment, and the noble Baroness is absolutely right to do that.
The Government’s response was reasonably satisfactory, but this is exactly why this group explores the definition of verification and so on, and tries to set standards for verification, because we believe that there is a gap in all this. I understand that this is not central to the noble Baroness’s case, but—believe me—anonymity was one of the most difficult issues that we discussed in the Joint Committee, and you have to come down somewhere in that debate.
Requiring platforms to allow users to see other users’ verification status is a crucial further pillar to user empowerment, and it provides users with a key piece of information about other users. Being able to see whether an account is verified would empower victims of online abuse or threats—I think this partly answers the noble Baroness’s question—to make more informed judgments about the source of the problem, and therefore take more effective steps to protect themselves. Making verification status visible to all users puts more choice in their hands as to how they manage the higher risks associated with non-verified and anonymous accounts, and offers them a lighter-touch alternative to filtering out all non-verified users entirely.
We on these Benches support the amendments that have been put forward. Amendment 141 aims to ensure that a user verification duty delivers in the way that the public and Government hope it will—by giving Ofcom a clear remit to require that the verification systems that platforms are required to develop in response to the duty are sufficiently rigorous and accessible to all users.
I was taken by what the noble Baroness, Lady Bull, said, particularly the case she made for Ofcom’s duties as regards those with disabilities. We need Ofcom to be tasked with setting out the principles and minimum standards, because otherwise platforms will try to claim, as verification, systems that do not genuinely verify a user’s identity, are unaffordable to ordinary users or use their data inappropriately.
Likewise, we support Amendment 303, which would introduce a definition of “user identity verification” into the Bill to ensure that we are all on the same page. In Committee in the House of Commons, Ministers suggested that “user identity verification” is an everyday term and so does not need a definition. That was not a convincing answer, which is why this amendment, which no doubt the noble Baroness, Lady Merron, will speak to in more detail, is bang on point.
I heard what the noble Baroness, Lady Buscombe, had to say, but in many ways the amendment in the previous group in the name of the noble Lord, Lord Knight, met some of her concerns. As regards the amendment in the name of the noble Lord, Lord Moylan, we are all Wikipedia fans, so we all want to make sure that there is no barrier to Wikipedia operating successfully. I wonder whether the noble Lord is making rather a lot out of the Wikipedia experience, but I am sure the Minister will enlighten us all and have a spot-on response for him.
(1 year, 6 months ago)
Lords Chamber
My Lords, I support something between the amendments of the noble Lords, Lord Stevenson and Lord Bethell, and the Government. I welcome all three and put on record my thanks to the Government for making a move on this issue.
There are three members of the pre-legislative committee still in the Chamber at this late hour, and I am sure I am not the only one of those three who remembers the excruciating detail in which Suzanne Webb MP, during an evidence session with Meta’s head of child safety, established that there was nowhere to report harm, but nowhere—not up a bit, not sideways, not to the C-suite. It was stunning. I have used that clip from the committee’s proceedings several times in schools to show what we do in the House of Lords, because it was fascinating. That fact was also made abundantly clear by Frances Haugen. When we asked her why she took the risk of copying things and walking them out, she said, “There was nowhere to go and no one to talk to”.
Turning to the amendments, like the noble Baroness, Lady Harding, I am concerned about whether we have properly dealt with C-suite reporting and accountability, but I am a hugely enthusiastic supporter of that accountability being in the system. I will be interested to hear the Minister speak to the Government’s amendment, but also to some of the other issues raised by the noble Lord, Lord Knight.
I will comment very briefly on the supply chain and Amendment 219. In doing so, I go back to Amendment 2, debated last week, which sought to add services not covered by the current scope but which clearly promoted and enabled access to harm and which were also likely to be accessed by children. I have a long quote from the Minister but, because of the hour, I will not read it out. In effect, and to paraphrase, he said, “Don’t worry, they will be caught by the other guys—the search and user-to-user platforms”. If the structure of the Bill means that it is mandatory for the user-to-user and search platforms to catch the people in the supply chain, surely it would be a great idea to put that in the Bill absolutely explicitly.
Finally, while I share some of the concerns raised by the noble Baroness, Lady Fox, I repeat my constant reprise of “risk not size”. The size of the fine is related to the turnover of the company, so it is actually proportionate.
My Lords, this has been a really interesting debate. I started out thinking that we were developing quite a lot of clarity. The Government have moved quite a long way since we first started debating senior manager liability, but there is still a bit of fog that needs dispelling—the noble Baronesses, Lady Kidron and Lady Harding, have demonstrated that we are not there yet.
I started off by saying yes to this group, before I got to grips with the government amendments. I broadly thought that Amendment 33, tabled by the noble Lord, Lord Stevenson, and Amendment 182, tabled by the noble Lord, Lord Bethell, were heading in the right direction. However, I was stopped short by Trustpilot’s briefing, which talked about a stepped approach to breaches and so on—that is a very strong point. It says that it is important to recognise that not all breaches should carry the same weight. In fact, it goes further: certain things should not even be an offence unless you have been persistent or negligent. We have to be quite mindful of how we formulate criminal offences.
I very much liked what the noble Lord, Lord Bethell, had to say about the tech view of its own liability. We have all seen articles about tech exceptionalism, and, for some reason, that seems to have taken quite a hold, so we have to dispel that as well. That is why I very much liked what the noble Lord, Lord Curry, said. It seemed to me that that was very much part of a stepped approach, while also being transparent about the object of the exercise to the company involved. That fits very well with the architecture of the Bill.
The noble Baroness, Lady Harding, put her finger on it: the Bill is not absolutely clear. In the Government’s response to the Joint Committee’s report, we were promised that, within three to six months, we would get that senior manager liability. On reading the Bill, I am certainly still a bit foggy about it, and it is quite reassuring that the noble Baroness, Lady Harding, is foggy about it too. Is that senior manager liability definitely there? Will it be there?
The Joint Committee made two other recommendations which I thought made a lot of sense: the obligation to report on risk assessment to the main board of a company, and the appointment of a safety controller, which the noble Lord, Lord Knight, mentioned. Such a controller would make it very clear—as with the GDPR, you would have a senior manager on whom you can fix the duty.
Like the noble Baroness, Lady Harding, I would very much like to hear from the Minister on the question of personal liability, as well as about Ofcom. It is important that any criminal prosecution is mediated by Ofcom; that is cardinal. You cannot just create criminal offences where you can have a prosecution without the intervention of Ofcom. That is extraordinarily important.
I have just a couple of final points. The noble Baroness, Lady Fox, comes back quite often to this point about regulation being the enemy of innovation. It very much depends on what kind of innovation we are talking about. Technology is not necessarily neutral; it depends on how the humans who deploy it operate it. In circumstances such as this, where we are talking about children and about smaller platforms that can do harm, I have no qualms about having regulation or indeed criminal liability. This is a really important area.
I very strongly support Amendment 219. It deals with a really important aspect which is completely missing from the Bill. I have a splendid briefing here, which I am not going to read out, but it is all about Mastodon as one example of a new style of federated platform, in which the app or hub for a network may be category 1 owing to the size of its user base, but individual subdomains or networks sitting below it could fall under category 2 status. I am very happy to give a copy of the briefing to the Minister; it is a really well-written brief, and it demonstrates exactly some of the issues we are talking about here.
I reassure the noble Lord, Lord Knight, that I think the amendment is very well drafted. It is really quite cunning in the way that it is done.
(1 year, 7 months ago)
Lords Chamber
I also welcome these amendments, but I have two very brief questions for the Minister. First, in Amendment 27A, it seems that the child risk assessment is limited only to category 1 services and will be published only in the terms of service. As he probably knows, 98% of people do not read terms of service, so I wondered where else we might find this, or whether there is a better way of dealing with it.
My second question is to do with Amendments 64A and 88A. It seems to me—forgive me if I am wrong—that the Bill previously stipulated that all regulated search and user-to-user services had to make and keep a written record of any measure taken in compliance with a relevant duty, but now it seems to have rowed back to only category 1 and 2A services. I may be wrong on that, but I would like to check it for the record.
My Lords, the noble Baroness, Lady Kidron, put her finger exactly on the two questions that I wanted to ask: namely, why only category 1 and category 2A, and is there some rowing back involved here? Of course, none of this prejudges the point that, when we come later in Committee to talk about widening the ambit of risk assessments to material other than that specified in the Bill, this kind of transparency would be extremely useful. But it would be very useful to hear the rationale for why it is only category 1 and category 2A in particular.
(3 years, 10 months ago)
Lords Chamber
My Lords, I shall speak to Amendment 23 in my name and those of the noble Lords, Lord Stevenson of Balmacara, Lord Clement-Jones and Lord Sheikh. This amendment represents the wishes of many colleagues from all sides of the House, and with that in mind I have informed the clerk that we intend to divide the House. I refer noble Lords to my interests in the register, particularly that as chair of the 5Rights Foundation, a charity that works to build the digital world that children deserve.
The amendment has been slightly revised since it was tabled in Committee, to reflect comments made then, but its purpose remains resolutely the same: to ensure that the online safety of children and other vulnerable users is not compromised as a consequence of clauses that appear in future free trade agreements.
Like many colleagues, I would rather that the UK Parliament had, as the US Congress does, a system of parliamentary scrutiny of all aspects of trade deals, but that is not the case. The amendment would offer significant protections for UK children online by protecting UK domestic law, widely regarded as the best in the world, as far as it affects children’s online safety. It would sit after Clause 2 and would therefore pertain to all future UK trade deals.
Proposed new subsection (2)(a) would capture existing UK legislation and treaties. This would allow the Government to cite existing treaties, such as the Convention on the Rights of the Child, which the UK has ratified but the US has not, or domestic legislation that already offers protections for children online. It would also capture any further advances made in UK law between now and the time that any trade agreement is settled.
Proposed new subsection (2)(b) specifically refers to data protections brought into law on 2 September last year in the form of the age-appropriate design code, which will have a profound impact on children’s online safety. That code was an initiative introduced and won in this House by a similar all-party grouping, with support from all sides of the House. It would also ensure that the Data Protection Act 2018 was protected in total, since many of the provisions of the children’s code build on the broader provisions of the DPA.
Proposed new subsection (2)(c) would give the Secretary of State the power to carve out from a trade deal any new or related legislation—for example, the upcoming online harms Bill, or any provisions put forward as the result of inquiries by the Competition and Markets Authority, the Law Commission, Ofcom, the ICO and so on. Digital regulation is a fast-moving area of policy, and the discretion given to the Secretary of State by subsection (2)(c) would ensure his or her ability to reflect the latest commitments on children’s online protection in FTAs.
The amendment would also define a child as any person under 18. This is crucial, since the US domestic consumer law, COPPA, has created a de facto age of adulthood online of 13, in the face of all tradition and decades of evidence of child development. The use of 13 as a false marker of adulthood has been thoughtlessly mirrored around the world. It fails to offer any protection to those aged 13 to 17, who require protections and freedoms in line with their evolving maturity but are clearly not yet adults.
I am very grateful to both the Minister and the Minister of State for Trade Policy, Greg Hands MP, for taking the time to speak to me since I first tabled this amendment. I am sympathetic to their overall position that the Bill should not tie the hands of UK trade negotiators, but in this case it is imperative that we do so, because some things are simply not for sale.
In the very few weeks since we debated this amendment in Committee, we have seen that the protections outlined in the amendment are entirely absent from the EU-UK deal, and in the same few weeks we have seen suggestions for the inclusion of provisions in the proposed mini-deal with the US that could completely undermine all the advances that we have made to protect children. That is even before we get to a full-blown US-UK FTA. In this context, Ministers can no longer cast doubt on the relevance of the amendment, nor can they suggest that this is an issue that can be dealt with at some indeterminate time in the future. We have set our sights on being a sovereign trading nation and are seeking to do that in short order. We must make sure from the very beginning that we do not trade away the safety and security of our children.
In closing, I point to the Government’s recent online harms response and say to the Minister, whom I know to be personally committed to the safety of children, that it is simply impossible to fulfil the promises made to parents and children in the context of the online harms Bill without also determinedly protecting the advances and commitments that we have already made. Amendment 23 would ensure that the UK domestic attitudes, legislation and guidance that protect children’s safety online could not be traded away. In a trade deal, no one side ever gets everything that it wants. We have to take kids off the table. I beg to move.
My Lords, it is a privilege to follow the noble Baroness, Lady Kidron, and her extremely cogent introduction. I have signed Amendment 23, which we on these Benches strongly support. I pay tribute to her consistent campaigning efforts in the area of online child safety and child protection. Very briefly, I will add why we need this amendment, through some recent media headlines which illustrate the issues involved.
First, on the extent of online harms, here are just a few headlines:
“Social media stalking on rise as harassers dodge identity checks”,
“QAnon is still spreading on Facebook, despite a ban”,
“Facebook’s algorithm a major threat to public health”
and
“Tech companies continue to provide online infrastructure for contentious Covid-19 websites even after flagging them as fake news, finds new Oxford study”.
Many of these online harms impact heavily on children and other vulnerable groups.
Secondly, here are two headlines on the power of big tech:
“Google told its scientists to ‘strike a positive tone’ in AI research documents”
and
“Facebook says it may quit Europe over ban on sharing data with US”.
There can be no doubting the sheer global lobbying power of the major platforms and their ability to influence governments.
Thirdly, on the opportunity for change and to retain our laws, the headlines included
“New ‘transformational’ code to protect children’s privacy online”,
which refers to the age-appropriate design code that has now been renamed “the children’s code”, and
“Britain can lead the world in reining in the tech giants if we get the details right”,
which refers to the proposals to introduce a new online duty of care.
“CMA advises government on new regulatory regime for tech giants”
refers to the new digital markets unit, and the CMA is referred to again in:
“Google told to stamp out fraudulent advertising”.
We have started down a crucial road of regulating the behaviour of the big tech companies and preventing harm, particularly to our children and the vulnerable. In any trade deal we want to preserve the protections that our citizens have, and all those that are coming into force, and we do not want to water them down in any way as a result of any trade negotiation.
The trade deal that looms largest is of course with the US, and there are indications that with the new Administration, which so many of us welcome, there will be new attitudes towards privacy rights, especially now that it seems that Congress will be under Democrat majority control. I hope that they will vigorously pursue the antitrust cases that have been started, but we have no guarantee that they will go further, for instance in successfully eliminating the all-important safe harbour legal shield for internet companies, Section 230 of the Communications Decency Act. There is no guarantee that this will go, or that there will not be attempts by the US to entrench it in its future trade deals.
The Minister, the noble Lord, Lord Grimstone, for whom I have the greatest respect, will no doubt say that the Government will have red lines in their negotiations and that there is no way that they will countenance negotiating away the online protections which we currently have. But, as we have seen with the withdrawal agreement, Northern Ireland, the fishing industry and the UK-EU Trade and Co-operation Agreement, these can be washed away, or blurred, as data protection is in the agreement with Japan. So there is a great degree of uncertainty on both sides of the Atlantic. For that reason, without doubting any assurance that the Minister gives, this amendment is essential, and on these Benches we will strongly support it if the noble Baroness, Lady Kidron, takes it to a vote.