Data (Use and Access) Bill [HL] Debate
Grand Committee

Baroness Kidron (Crossbench)
My Lords, I will speak to Amendments 59, 62, 63 and 65 in the name of my noble friend Lord Colville, and Amendment 64 in the name of the noble Lord, Lord Clement-Jones, to which I added my name. I am also very much in sympathy with the other amendments in this group more broadly.
My noble friend Lord Colville set out how he is seeking to understand what the Government intend by “scientific research” and to make sure that the Bill does not offer a loophole so big that any commercial company can avoid the data protection rights of UK citizens in the name of science.
At Second Reading, I read out a dictionary definition of science:
“The systematic study of the structure and behaviour of the physical and natural world through observation, experimentation, and the testing of theories against the evidence obtained”—
i.e. everything. I also ask the Minister whether the following scenarios could reasonably be considered scientific. Is updating or improving a new tracking app for fitness, or a bot for an airline, scientific? Is the behavioural science of testing children’s responses to persuasive design strategies, in order to extend the stickiness of commercial products, scientific? These are practical scenarios, and I would be grateful for an answer in order to understand what is in and out of the scope of the Bill.
When I raised Clause 67 at a briefing meeting, it was said that it was, as my noble friend Lord Colville suggested, just housekeeping. The law firm Taylor Wessing suggests that what can
“‘reasonably be described as scientific’ is arguably very wide and fairly vague, so it will be interesting to see how this is interpreted, but the assumption is that it is intended to be a very broad definition”.
Each of the 14 law firm blogs and briefings that I read over the weekend described it variously as loosening, expanding or broadening. Not one suggested that it was a tightening and not one said that it was a no-change change. As we have heard, the European Data Protection Supervisor published an opinion stating that
“scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests”.
When the Minister responds, perhaps she could say whether the particular scenarios I have set out fall within the definition of scientific and why the Government have failed to reflect the critical clarification of the European Data Protection Supervisor in transferring the recital into the Bill.
I turn briefly to Amendment 64, which would limit the use of children’s personal data for the purposes of research and education by making it subject to a public interest requirement and opt-in from the child or a parent. I will speak in our debate on a later grouping to amendments that would enshrine children’s right to higher protection and propose a comprehensive code of practice on the use of children’s data in education, which is an issue of increasing scandal and concern. For now, it would be good to understand whether the Government agree that education is an area of research where a public interest requirement is necessary and appropriate and that children’s data should always be used to support their right to learn, rather than to commoditise them.
During debate on the DPDI Bill, a code of practice on children’s data and scientific research was proposed; the Minister added her name to it. It is by accident rather than by design that I have failed to lay it here, but I will listen carefully to the Minister’s reply to see whether children need additional protections from scientific research as the Government now define it.
My Lords, I have in subsequent groups a number of amendments that touch on many of the issues raised here, so I will not detain the Committee by going through them at this stage and repeating them later. However, I feel that, although the Government have had the best intentions in bringing forward a set of proposals in this area intended to update and bring together rather conflicting and difficult pieces of legislation left over from the Brexit arrangements, they have managed to open up a gap between where we want to be and where we will be if the Bill goes forward in its present form. I say that in relation to AI, which is a subject requiring a lot more attention and a lot more detail than we have before us. I doubt very much whether the Government will have the appetite for dealing with that in time for this Bill, but I hope that at the very least—it would be a minor concession at this stage—they will commit at the Dispatch Box to seeking to resolve these issues in legislation within a very short period because, as we have heard from the arguments made today, it is desperately needed.
More importantly, if, by bringing together documentation that is thought to represent the current situation, either inadvertently or otherwise, the Government have managed to open up a loophole that will devalue the way in which we currently treat personal data—I will come on to this when I get to my groups in relation to the NHS in particular—that would be a grievous situation. I hope that, going forward, the points that have been made here can be accommodated in a statement that will resolve them, because they need to be resolved.
My Lords, I have to admit that I am slightly confused by the groupings at this point. It is very easy to have this debate in the medical space, to talk about the future of disease, fixing diseases and longevity, but my rather mundane questions have now gone unanswered twice. Perhaps the Minister will write to me about where the Government see scientific research on product development in some of these other spaces.
We will come back to the question of scraping and intellectual property, but I want to add my support to my noble friend Lord Freyberg’s amendment. I also want to add my voice to the question of the AI Bill that is coming. Data is fundamental to the AI infrastructure; data is infrastructure. I do not understand how we can have a data Bill that does not have one eye on AI, looking towards it, or how we are supposed to understand the intersection between the AI Bill and the data Bill if the Government are not more forthcoming about their intentions. At the moment, we are seeing a reduction in data protection that looks as though it is anticipating, or creating a runway for, certain sorts of companies.
Finally, I am sorry that the noble Lord is no longer in his place, but later amendments look at creating sovereign data assets around the NHS and so on, and I do not think that those of us who are arguing to make sure that it is not a free-for-all are unwilling to create, or are not interested in creating, ways in which the huge investment in the NHS and other datasets can be realised for UK plc. I do not want that to appear to be where we are starting just because we are unhappy about the roadway that Clause 67 appears to create.
Many thanks to the noble Lords who have spoken in this debate and to the noble Lord, Lord Freyberg, for his Amendment 60. Before I start, let me endorse and add my name to the request for something of a briefing about the AI Bill. I am concerned that we will put a lot of weight of expectation on that Bill. When it comes, if I understand this right, it will focus on the very largest AI labs and may not necessarily get to all the risks that we are talking about here.
Amendment 60 seeks to ensure that the Bill does not allow privately funded or commercial activities to be considered scientific research in order
“to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining”.
This is a sensible, proportionate measure to achieve an important end, but I have some concerns about the underlying assumption, as it strikes me. There is a filtering criterion of whether or not the research is taxpayer funded; that feels like a slightly crude means of predicting the propensity to infringe copyright. I do not know where to take that so I shall leave it there for the moment.
Amendment 61 in my name would ensure that data companies cannot justify data scraping for AI training as scientific research. As many of us said in our debate on the previous group, as well as in our debate on this group, the definition of “scientific research” in the Bill is extremely broad. I very much take on board the Minister’s helpful response on that but, I must say, I continue to have some concerns about the breadth of the definition. The development of AI programs, funded privately and as part of a commercial enterprise, could be considered scientific, so I believe that this definition is far too broad, given that Article 8A(3), to be inserted by Clause 71(5), states:
“Processing of personal data for a new purpose is to be treated as processing in a manner compatible with the original purpose where … the processing is carried out … for the purposes of scientific research”.
Tightening up the definition of “scientific research” to exclude activities that are primarily commercial prevents companies from creating a scientific pretence for research that is wholly driven by commercial gain rather than by furthering our collective knowledge. I would argue that, if we wish to allow these companies to build and train AI—we must, or others will—we must put in proper safeguards for people’s data. Data subjects should have the right to consent to their data being used in such a manner.
Amendment 65A in the name of my noble friend Lord Holmes would also take steps to remedy this concern. I believe that this amendment would work well in tandem with Amendment 61. It makes it absolutely clear that we expect AI developers to obtain consent from data subjects before they use or reuse their data for training purposes. For now, though, I shall not press my amendment.
My Lords, I feel we are getting slightly repetitive, but before I, too, repeat myself, I should like to say something that I did not get the chance to say to the noble Viscount, Lord Colville, the noble Baroness, Lady Kidron, and others: I will write, we will meet—all the things that you have asked for, you can take it for granted that they will happen, because we want to get this right.
I say briefly to the noble Baroness: we are in danger of thinking that the only good research is health research. If you go to any university up and down the country, you find that the most fantastic research is taking place in the most obscure subjects, be it physics, mechanical engineering, fabrics or, as I mentioned earlier, quantum. A lot of great research is going on. We are in danger of thinking that life sciences are the only thing that we do well. We need to open our minds a bit to create the space for those original thinkers in other sectors.
Perhaps I did not make myself clear. I was saying that the defence always goes to space or to medicine, and we are trying to ascertain the product development that is not textiles, and so on. I have two positions in two different universities; they are marvellous places; research is very important.
I am glad we are on the same page on all that.
I now turn to the specifics of the amendments. I thank the noble Lords, Lord Freyberg and Lord Holmes, and the noble Viscount, Lord Camrose, for their amendments, and the noble Lord, Lord Lucas, for his contribution. As I said in the previous debate, I can reassure all noble Lords that if an area of research does not count as scientific research at the moment, it will not under the Bill. These provisions do not expand the meaning of scientific research. If noble Lords still feel unsure about that, I am happy to offer a technical briefing to those who are interested in this issue to clarify that as far as possible.
Moreover, the Bill’s requirement for a reasonableness test will help limit the misuse of this definition more than the current UK GDPR, which says that scientific research should be interpreted broadly. We are tightening up the regulations. This is best assessed on a case-by-case basis, along with the ICO guidance, rather than by automatically disqualifying or approving entire sectors of activity.
Scientific research that is privately funded or conducted by commercial organisations can also have a life-changing impact. The noble Lord, Lord Markham, was talking earlier about health; the development of Covid vaccines is just one example of this. It was commercial research that was absolutely life-saving, at the end of the day.
My Lords, I rise briefly to support the amendments in the name of the noble Lord, Lord Stevenson of Balmacara. I must say that the noble Lord, Lord Clement-Jones, made a very persuasive speech; I shall be rereading it and thinking about it more carefully.
In many ways, purpose limitation is the jewel in the crown of GDPR. It does what it says on the tin: data should be used for the original purpose, and if the purpose is then extended, we should go back to the person and ask whether it can be used again. While I agree with and associate myself with the technical arguments made by the noble Lord, Lord Stevenson, that is the fundamental point.
The issue here is, what are the Government trying to do? What are we clearing a pathway for? In a later group, we will speak to a proposal to create a UK data sovereign fund to make sure that the value of UK publicly held data is realised. The value is not simply economic or financial, but societal. There are ways of arranging all this that would satisfy everyone.
I have been sitting here wondering whether to say it, but here I go: I am one of the 3.3 million.
So is the noble Lord, Lord Clement-Jones. I withdrew my consent because I did not trust the system. I think that what both noble Lords have said about trust could be spread across the Bill as a whole.
We want to use our data well. We want it to benefit our public services. We want it to benefit UK plc and we want to make the world a better place, but not at the cost of individual data subjects and not at too great a cost. I add my voice to that. On the whole, I prefer systems that offer protections by design and default, as consent is a somewhat difficult concept. But, inasmuch as consent is a fundamental part of the current regulatory system and nothing in the Bill gets rid of it wholesale for some better system, it must be applied meaningfully. Amendments 79, 81 and 131 make clear what we mean by the term, ensure that the definition is consistent and clarify that it is not the intention of the Government to lessen the opportunity for meaningful consent. I, too, ask the Minister to confirm that it is not the Government’s intention to downgrade the concept of meaningful consent in the way that the noble Lord, Lord Stevenson, has set out.
My Lords, I support Amendment 71 and others in this group from the noble Lords, Lord Clement-Jones and Lord Stevenson. I apologise for not being able to speak at Second Reading. The noble Lord, Lord Clement-Jones, will remember the deep interest we took in this issue when I was a Health Minister, and the conversations that we had.
I had a concern at the time. We all know that the NHS needs to be digitised and that relevant health professionals need to be able to access relevant data when they need to, so that you are not stuck with one doctor when you go to another part of the country. There are so many efficiencies that we could have in the system, as long as records are accessed by relevant and appropriate health professionals at the right time. But it is also important that patients have confidence in the system and that their personal data cannot be shared with commercial organisations without them knowing. As other noble Lords have said, this is an issue of trust.
For that reason, when I was in that position, I reached out to civil liberties organisations to understand their concerns. For example, medConfidential was very helpful and had conversations with DHSC and NHS officials. In fact, after those conversations, officials told me that its demands were reasonable and that some of the things being asked for were not that difficult to give and were common sense.
I asked a Written Question of the noble Baroness’s ministerial colleague, the noble Baroness, Lady Merron, about whether patients will be informed of who has had access to their patient record, because that is important for confidence. The Answer I got back was that the Government were proposing a single unified health record. We all know that. She said:
“Ensuring that patients’ confidential information remains protected and is seen only by those who need to see it will be a priority. Public engagement next month will help us understand what safeguards patients would want to see”.
Surely the fact that patients have opted out shows that they already have concerns and have raised them.
The NHS can build the best data system—or the federated data platform, as it is called—but without patient confidence it is simply a castle made of sand. As one of my heroes, Jimi Hendrix, once said, castles made of sand fall into the sea eventually. We do not want to see that with the federated data platform. We want to see a modernised system of healthcare digital records, allowing joined-up thinking on health and care right across a patient’s life. We should be able to use machine learning to analyse those valuable datasets to improve preventive care. But, for that to happen, the key has to be trust and patients being confident that their data is secure and used in the appropriate way. I look forward to the Minister’s response.
I cannot compete with that tour de force. I shall speak to Amendments 73 and 75 in the name of the noble Lord, Lord Clement-Jones, to which I have added my name; to Amendments 76, 83 and 90 on the Secretary of State’s powers; and to Amendments 85 and 86, to which I wish I had added my name, but it is hard to keep up with the noble Lord. I am in sympathy with the other amendments in the group.
The issue of recognised legitimate interest has made a frequent appearance in the many briefings I have received and, despite reading the Explanatory Notes for the Bill several times, I have struggled to understand in plain English the Government’s intent and purpose. I went to the ICO website to remind myself of the definition of legitimate interest, to try to understand why recognised legitimate interest was necessary. It states:
“Legitimate interests is the most flexible lawful basis for processing, but you cannot assume it will always be the most appropriate.”
and then goes on:
“If you choose to rely on legitimate interests, you are taking on extra responsibility for considering and protecting people’s rights and interests.”
That seems to strike a balance between compelling justifications for processing and the need to consider and protect individual data rights and interests. I would be very interested to hear from the Minister why the new category of “recognised legitimate interest” is necessary. Specifically, why do the Government believe that, when processing may have far-reaching consequences, such as national security, crime prevention and safeguarding, there is no need to undertake a legitimate interest assessment? What is the justification for giving any public body the ability to demand data from private companies for any purpose? I ask those questions to be precise about the context and purpose.
I am not suggesting that there is no legitimate interest for processing personal data without consent, but the legitimate interest assessment is a check and balance that ensures oversight and reduces the risk of overreach. It is a test, not a blocker, and does not in itself prevent processing if the balancing test determines that processing should go ahead. Amendment 85 illustrates this point in relation to vulnerable users. Given that a determination that a person is at risk would have far-reaching consequences for that person, the principles of fairness and accountability demand that those making the decision must follow a due process and that those subject to the decision are aware—if not in an emergency, certainly at some point in the proceedings.
In laying Amendment 86, the noble Lord, Lord Clement-Jones, raises an important question that I am keen to hear from Ministers on: what is the Government’s plan for ensuring that a designation that an individual is vulnerable is monitored and removed when it is no longer appropriate? If a company or organisation has a legitimate interest in processing someone’s data, having considered the balancing interests of data subjects, it is free to do so. I ask the Minister again to give concrete examples of circumstances in which the current legitimate interest basis is insufficient, so that we understand the problem the Government are trying to solve.
At Second Reading, the Government’s curious defence of this new measure was the idea that organisations had concerns about whether they were doing the balancing test correctly, so the new measure is there to help. But perhaps the Minister can explain what benefits accrue from introducing the new measure that could not have been better achieved by the ICO providing more concrete guidance on the balancing test. Given that the measure is focused on public interest areas, such as national security and the detection of crime, how does the creation of the recognised legitimate interest help the majority of data controllers, rather than simply serving the interests of incumbents and/or government departments by removing an important check or balance?
Amendments 76, 83 and 90 seek to curb the power of the Secretary of State to override primary legislation and to modify key aspects of UK data protection law via statutory instrument. The proposed provisions in Clauses 70, 71 and 74 put one person in control, rather than Parliament. Elon Musk’s new role in the upcoming US Administration gives him legitimacy as an incoming officeholder in the Executive, but that role is complicated by the fact that he is also CEO and majority shareholder of X. Like the executives of OpenAI, Google, Amazon, Palantir or any other tech behemoth, tech execs are not elected or bound to fulfil social goods or commitments, other than making a profit for their shareholders. They also fund many of the think tanks, reports and events in the political ecosystem, and there is a well-worn path of employment between industry, government and regulators.
No single person should be the carrier of that incredible burden. For now, Parliament is the only barrier in the increasingly confused picture of regulatory and political capture by the tech sector. We should fight to keep it that way.
My Lords, I support Amendment 74 from the noble Lords, Lord Scriven and Lord Clement-Jones, on excluding personal health data from being a recognised legitimate interest. I also support Amendment 78 on having a statement by the Secretary of State to recognise that legitimate interest and Amendments 83 and 90, which would remove powers from the Secretary of State to override primary legislation to modify data protection via an SI. There is not much to add to what I said on the previous group, so I will not repeat all the arguments made then. In simple terms, I repeat the necessity for trust—in health, particularly for patient trust. You do not gain trust simply by defining personal health data as a legitimate interest or by overriding primary legislation on the say-so of a Secretary of State, even if it is laid as a statutory instrument.
My Lords, I thought I had no speech; that would have been terrible. In moving my amendment, I thank the noble Baronesses, Lady Kidron and Lady Harding of Winscombe, and the noble Lord, Lord Russell of Liverpool, for their support. I shall speak also to Amendments 94, 135 and 196.
Additional safeguards are required for the protection of children’s data. This amendment
“seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A”.
The change to the purpose limitation in Clause 71 raises questions about the lifelong implications of the proposed change for children, given the expectation that they are less aware of the risks of data processing and may not have made their own preferences or choices known at the time of data collection.
For most children’s data processing, adults give permission on their behalf. The extension of this for additional purposes may be incompatible with what a data subject later wishes as an adult. The only protection they may have is purpose limitation to ensure that they are reconsented or informed of changes to processing. Data reuse and access must not mean abandoning the first principles of data protection. Purpose limitation rests on the essential principles of “specified” and “explicit” at the time of collection, which this change does away with.
There are some questions that I would like to put to the Minister. If further reuses, such as more research, are compatible, they are already permitted under current law. If further reuses are not permitted under current law, why should data subjects’ current rights be undermined as a child and, through this change, never be able to be reclaimed at any time in the future? How does the new provision align with the principle of acting in the best interests of the child, as outlined in the UK GDPR, the UNCRC in Scotland and the Rights of Children and Young Persons (Wales) Measure 2011? What are the specific risks to children’s data privacy and security under the revised rules for purpose limitation that may have an unforeseeable lifelong effect? In summary, a blanket exclusion for children’s data processing conforms more with the status quo of data protection principles. Children should be asked again about data processing once they reach maturity and should not find that data rights have been given away by their parents on their behalf.
Amendment 196 is more of a probing amendment. Ofcom has set out its approach to the categorisation of category 1 services under the Online Safety Act. Ofcom’s advice and research, submitted to the Secretary of State, outlines the criteria for determining whether a service falls into category 1. These services are characterised by having the highest reach and risk functionalities among user-to-user services. The categorisation is based on certain threshold conditions, which include user numbers and functionalities such as content recommender systems and the ability for users to forward or reshare content. Ofcom has recommended that category 1 services should meet either of two sets of conditions: having more than 34 million UK users with a content recommender system or having more than 7 million UK users with a content recommender system and the ability for users to forward or reshare user-generated content. The categorisation process is part of Ofcom’s phased approach to implementing codes and guidance for online safety, with additional obligations for category 1 services due to their potential as sources of harm.
The Secretary of State recently issued the Draft Statement of Strategic Priorities for Online Safety, under Section 172 of the Online Safety Act. It says:
“Large technology companies have a key role in helping the UK to achieve this potential, but any company afforded the privilege of access to the UK’s vibrant technology and skills ecosystem must also accept their responsibility to keep people safe on their platforms and foster a safer online world … The government appreciates that Ofcom has set out to government its approach to tackling small but risky services. The government would like to see Ofcom keep this approach under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online.
As the online safety regulator, we expect Ofcom to continue focusing its efforts on safety improvements among services that pose the highest risk of harm to users, including small but risky services. All search services in scope of the Act have duties to minimise the presentation of search results which include or lead directly to illegal content or content that is harmful to children. This should lead to a significant reduction in these services being accessible via search results”.
During the parliamentary debates on the Online Safety Bill and in Joint Committee, there was significant concern about the categorisation of services, particularly about the emphasis on size over risk. Initially, the categorisation was based largely on user numbers and functionalities, which led to concerns that smaller platforms with high-risk content might not be adequately addressed. In the Commons, Labour’s Alex Davies-Jones MP, now a Minister in the Ministry of Justice, argued that focusing on size rather than risk could fail to address extreme harms present on smaller sites.
The debates also revealed a push for a more risk-based approach to categorisation. The then Government eventually accepted an amendment allowing the Secretary of State discretion in setting thresholds based on user numbers, functionalities or both. This change aimed to provide flexibility in addressing high-risk smaller platforms. However, concerns remain, despite the strategy statement and the amendment to the original Online Safety Bill, that smaller platforms with significant potential for harm might not be sufficiently covered under the category 1 designation. Overall, while the final approach allows some flexibility, there is quite some debate about whether enough emphasis will be placed by Ofcom in its categorisation on the risks posed by smaller players. My colleagues on these Benches and in the Commons have emphasised to me that we should be rigorously addressing these issues. I beg to move.
My Lords, I shall speak to all the amendments in this group, and I thank noble Lords who have added their names to Amendments 88 and 135 in my name.
Amendment 88 creates a duty for data controllers and processors to consider children’s needs and rights. Proposed new subsection (1) simply sets out children’s existing rights and acknowledges that children of different ages have different capacities and therefore may require different responses. Proposed new subsection (2) addresses the concern expressed during the passage of the Bill and its predecessor that children should be shielded from the reduction in privacy protections that adults will experience under the proposals. Proposed new subsection (3) simply confirms that a child is anyone under the age of 18.
This amendment leans on a bit of history. Section 123 of the Data Protection Act 2018 enshrined the age-appropriate design code into our data regime. The AADC’s journey from amendment to fully articulated code, since mirrored and copied around the world, has provided two useful lessons.
First, if the intent of Parliament is clear in the Bill, it is fixed. After Royal Assent to the Data Protection Act 2018, the tech lobby came calling to both the Government and the regulator arguing that the proposed age of adulthood in the AADC be reduced from 18 to 13, where it had been for more than two decades. Both the department and the regulator held up their hands and pointed at the text, which cited the UNCRC that defines a child as a person under 18. That age remains, not only in the UK but in all the other jurisdictions that have since copied the legislation.
In contrast, on several other issues both in the AADC and, more recently, in the Online Safety Act, the intentions of Parliament were not spelled out and have been reinterpreted. Happily, the promised coroner provisions are now enshrined in this Bill, but promises from the Dispatch Box about the scope and form of the coroner provisions were initially diluted and had to be refought for a second time by bereaved parents. Other examples, such as promises of a mixed economy, age-assurance requirements and a focus on contact harm, features and functionalities as well as content, are some of the ministerial promises that reflected Parliament’s intention but do not form part of the final regulatory standards, in large part because they were not sufficiently spelled out in the Bill. What is in the Bill really matters.
Secondly, our legislation over the past decade is guilty of solving the problems of yesterday. There is departmental resistance to having outcomes rather than processes enshrined in legislation. Overarching principles, such as a duty of care, or rights, such as children’s rights to privacy, are abandoned in favour of process measures, tools that even the tech companies admit are seldom used, and narrow definitions of what must and may not be taken down.
Tech is various, its contexts infinite, its rate of change giddy and the skills of government and regulator are necessarily limited. At some point we are going to have to start saying what the outcome should be, what the principles are, and not what the process is. My argument for this amendment is that we need to fix our intention that in the Bill children have an established set of needs according to their evolving capacity. Similarly, they have a right to a higher bar of privacy, so that both these principles become unavoidable.
I thank all noble Lords who have raised this important topic. I say at the outset that I appreciate and pay tribute to those who have worked on this for many years—in particular the noble Baroness, Lady Kidron, who has been a fantastic champion of these issues.
I also reassure noble Lords that these provisions are intended to build upon, and certainly not to undermine, the rights of children as they have previously been defined. We share noble Lords’ commitment to ensuring high standards of protection for children. That is why I am glad that the Bill, together with existing data protection principles, already provides robust protections for children. I hope that my response to these amendments shows that we take these issues seriously. The ICO also recognises in its guidance, after the UN Committee on the Rights of the Child, that the duties and responsibilities to respect the rights of children extend in practice to private actors and business enterprises.
Amendment 82, moved by the noble Lord, Lord Clement-Jones, would exclude children’s personal data from the exemptions to the purpose limitation principles in Schedule 5 to the Bill. The new purposes are for important public interests only, such as safeguarding vulnerable individuals or children. Broader existing safeguards in the data protection framework, such as the fairness and lawfulness principles, also apply. Prohibiting a change of purpose in processing could impede important activities, such as the safeguarding issues to which I have referred.
Amendment 88, tabled by the noble Baroness, Lady Kidron, would introduce a new duty requiring all data controllers to consider that children are entitled to higher protection than adults. We understand the noble Baroness’s intentions and, in many ways, share her aims, but we would prefer to focus on improving compliance with the current legislation, including through the way the ICO discharges its regulatory functions.
In addition, the proposed duty could have some unwelcome and unintended effects. For example, it could lead to questions about why other vulnerable people are not entitled to enhanced protections. It would also apply to organisations of all sizes, including micro-businesses and voluntary sector organisations, even if they process children’s data on only a small scale. It could also cause confusion about what they would need to do to verify age to comply with the new duty.
Amendment 94, also tabled by the noble Baroness, would ensure that the new notification exemptions under Article 13 would not apply to children. However, removing children’s data from this exemption could mean that some important research—for example, on the causes of childhood diseases—could not be undertaken if the data controller were unable to contact the individuals about the intended processing activity.
Amendment 135 would place new duties on the ICO to uphold the rights of children. The ICO’s new strategic framework, introduced by the Bill, has been carefully structured to achieve a similar effect. Its principal objective requires the regulator to
“secure an appropriate level of protection for personal data”.
This gives flexibility and nuance in the appropriateness of the level of protections; they are not always the same for all data subjects, all the time.
Going beyond this, though, the strategic framework includes the new duty relating to children. This acknowledges that, as the noble Baroness, Lady Kidron, said, children may be less aware of the risks and consequences associated with the processing of their data, as well as of their rights. As she pointed out, this is drawn from recital 38 to the UK GDPR, but the Government’s view is that the Bill’s language gives sufficient effect to the recital. We recognise the importance of clarity on this issue and hope that we have achieved it but, obviously, we are happy to talk further to the noble Baroness on this matter.
This duty will also be a consideration for the ICO and one to which the commissioner must have regard across all data protection activities, where relevant. It will inform the regulator’s thinking on everything from enforcement to guidance, including how work might need to be tailored to suit children at all stages of childhood in order to ensure that the levels of protection are appropriate.
Finally, regarding Amendment 196—
I thank the Minister for giving way. I would like her to explain why only half of the recital is in the Bill and why the fact that children merit special attention is not in the Bill. How can it possibly be that, in this Bill, we are giving children adequate protection? I can disagree with some of the other things that she said, but I would like her to answer that specific question.
To be on the safe side, I will write to the noble Baroness. We feel that other bits in the provisions of the Bill cover the other aspects but, just to be clear on it, I will write to her. On Amendment 196 and the Online Safety Act—
My Lords, although it is a late hour, I want to make two or three points. I hope that I will be able to finish what I wish to say relatively quickly. It is important that in looking at the whole of this Bill we keep in mind two things. One is equivalence, and the other is the importance of the rights in the Bill and its protections being anchored in something ordinary people can understand. Unfortunately, I could not be here on the first day but having sat through most of today, I deeply worry about the unintelligibility of this whole legislative package. We are stuck with it for now, but I sincerely hope that this is the last Civil Service-produced Bill of this kind. We need radical new thinking, and I shall try to explore that when we look at automated decision-making—again, a bit that is far too complicated.
Amendment 87 specifically relates to equivalence, and I want to touch on Amendment 125. There is in what I intend to suggest a fix to the problem, if it really exists, that will also have the benefit of underpinning this legislation by rights that people understand and that are applicable not merely to the state but to private companies. The problem that seems to have arisen—there are byproducts of Brexit that from time to time surface—stems from the whole history of the way in which we left the European Community. We left initially under the withdrawal Act, leaving retained EU law. No doubt many of us remember the debates that took place. The then Government were wholly opposed to keeping the charter. In respect of the protection of people’s data being processed, that is probably acceptable on the basis that the rights of the charter had merged into ordinary retained EU law through the decisions of the Court of Justice of the European Union. All was relatively well until the Retained EU Law (Revocation and Reform) Act, which deleted most general EU retained law principles, including fundamental rights, from the UK statute book. What then happened, as I understand it, was that a fix to this problem was attempted by the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023, which tidied up the UK GDPR by making clear that any references to fundamental rights and freedoms were to be regarded as references to convention rights within the meaning of the Human Rights Act.
For good and understandable reasons, the Human Rights Act applies to public authorities and, in very limited circumstances, to private bodies, but not to private bodies as a whole. That is accepted generally and is certainly accepted in the human rights memorandum in respect of this Bill. The difficulty with the Bill, therefore, is that the protections under the Human Rights Act apply only to public authorities and not to private bodies, whereas, generally speaking, the way in which the Charter of Fundamental Rights operated was to protect, also on a horizontal basis, the processing or use of data by private companies.
This seems to cause two problems. First, it is critical that there is no doubt about this, and I look forward to hearing what the Minister has to say as to the view of the Government’s legal advisers as to whether there is a doubt. Secondly, the amendment goes to the second of the two objectives which we are trying to achieve, which is to instil an understanding of the principles so that the ordinary member of the public can have trust. I defy anyone, even the experts who drafted this, to think that this is intelligible to any ordinary human being. It is simply not. I am sorry to be so rude about it, but this is the epitome of legislation that is, because of its sheer complexity, impossible to understand.
Of course, it could be made a lot better by a short series of principles introduced in the Bill, the kind of thing we have been talking about at times today, with a short introductory summary of what the rights are under the Bill. I hope consideration can be given to that, but that is not the purpose of my amendment. The fix that I suggest, addressing both the point of dealing with rights in a way that people can understand and the point on equivalence, is a very simple application, for the purposes of data processing, of the rights and remedies under the Human Rights Act, extending them to private bodies. One could therefore properly point, in going through the way that the Bill operates, to fundamental rights that people understand and that are applicable not merely where a public authority is processing the data but where data is processed by private bodies. That is what I wanted to say about Amendment 87.
Because it is closely allied to this on the equivalence point, I also want to add a word of support for the amendment in the name of the noble Lord, Lord Clement-Jones, for whose support I am grateful in respect of Amendment 87. That amendment relates to the need to have a thorough review of equivalence. Obviously, negotiations will take place, but it really is important that thorough attention is given to the adequacy of our legislation, to ensure that there is no incompatibility with the EU regime and that we do not lose adequacy. Those are the two amendments to which I wished to speak in this group. There are two reasons why I feel it would be wrong for me to go on and deal with the others: some are very narrow and some very broad, and it is probably easiest to listen to those who are speaking to those amendments in due course. On that basis, therefore, I beg to move.
My Lords, I will speak to Amendments 139, 140 and 109A—which was a bit of a late entry this morning—in my name. I express my thanks to those who have co-signed them.