Data (Use and Access) Bill [HL] Debate
Lord Lucas (Conservative - Excepted Hereditary)
(3 months, 1 week ago)
Lords Chamber
My Lords, I declare an interest in that, through the Good Schools Guide, I am an extensive user of government schools data. With another hat on, I share my noble friend Lord Markham’s worries about how this affects little organisations with a bit of membership data.
I very much look forward to Committee, when we will get into the Bill’s substance. I supported almost everything that the noble Baroness, Lady Kidron, said and look forward to joining in on that. I also very much support what my noble friend Lord Holmes said, in particular about trust, so he will be glad to know that I have in advance consulted Copilot as to the changes they would like to see in the Bill. If I may summarise what they said—noble Lords will note that I have taken the trouble to ascertain their choice of pronouns—they would like to see enhanced privacy safeguards, better transparency and accountability, regular public consultation and reviews of the Act, impact assessments before implementation, support for smaller entities and clearer definition of key terms. I am delighted by how much I find myself in agreement with our future overlords.
To add to what the noble Earl, Lord Erroll, said about digital identity being better, there was a widespread demonstration of that during Covid, when right-to-work checks went digital. Fraud went down as a result.
On the substantial changes that I would like to see, like my noble friend Lord Arbuthnot of Edrom, I would like a clear focus on getting definitions of data right. It is really important that we have stability and precision in data. What has been going on in sex and gender in particular is ridiculous. Like many other noble Lords, I also want a focus on the use of artificial intelligence in hiring. It is so easy now to get AI support for making a job application that the number of job applications has risen hugely. In response to this, of course, AI has been used in assessing job applications, because you really cannot plough through 500 in order to make a shortlist. Like the Better Hiring Institute, which I am associated with, I would really like to see AI used to give people the reasons why they have not been successful. Give everybody a reply and engage everybody in this process, rather than just ignoring them—and I apologise to the many people who send me emails that I do not reply to, but perhaps I will do better with a bit of AI.
This is a very seasonal Christmas tree of a Bill and I shall not be shy of hanging baubles on it when we come to Committee, in the way that many other noble Lords have done. My choices include trying to make it possible for the Student Loans Company to be more adventurous in the use of its data. It ought to be a really good way of finding out how successful our university system is. It is in touch with university graduates in a way that no other organisation is, but it feels constrained in the sorts of questions it might ask. I would really like Action Fraud to record all attempts at fraud, not just the successful frauds. We need a better picture of what is going on there. I would like to see another attempt to persuade the DfE that schools admissions data should be centrally gathered. At the moment it is really hard for parents to use, which means there is a huge advantage for parents who are savvy and have the time. That is not the way it should be. Everybody should have good, intelligent access to understanding what schools are open to them. There will be plenty of opportunities in Committee, which, as I say, I look forward to.
In the context of data and House of Lords reform, when I did a snap census at 5.47 pm, the Cross-Bench Peers were in the majority in the House. That suggests that, in providing Peers who have a real interest in the core business of this House—revising legislation—the process of choosing Cross-Bench Peers does rather better than the process of choosing the rest of us. If we are to reform the House of Lords, getting that aspect into the political selection would be no bad thing. I would also like some data, in the sense of some clear research, on the value of Statement repeats. I cannot recall an occasion when a Statement repeat resulted in any change of government policy of any description. Perhaps other noble Lords can enlighten me.
Data (Use and Access) Bill [HL] Debate
(3 months ago)
Grand Committee
My Lords, I apologise to the Committee for having not expected things to go quite as fast as they did. In moving Amendment 5, I will also speak to Amendments 200 and 202 in this group.
Amendment 5 is very much to do, in my mind, with the Office for Students and the Student Loans Company, but it addresses a more general problem: public bodies that hold a great deal of customer data find that they are unable to use that access and understanding for the greater public good. In the particular instance of the Student Loans Company, it is in active touch with most British young people who have been through university and is in an excellent position to help us understand how graduates, looking back a few years later, rate the quality of the courses they went through, so that we can get data and information that will enable universities to improve those courses for future students. That is important feedback that we ought to have in our university system. Otherwise, universities just concentrate on the students who are there now; the moment those students leave, the universities are not interested any more, until they are old enough to be asked for donations.
We should have a much better and more self-improving system, which could be driven through the Student Loans Company. I have in the past asked the company whether it would feel able to participate in such a thing, and it said no, it would not be permitted by data protection regulations to communicate in this way with the students it looks after. We should give ourselves the power to consider that in this Bill, so that we can look at how we could use that data to make life better for future generations of students.
There are other examples of where the public realm has gathered data and contact information on people to do with a particular set of transactions but feels unable to communicate with them again to do something slightly wider than that, so I suggest to the Government that something along the lines of Amendment 5 would open some very interesting doors to improving the performance of the public realm.
Amendment 200 is on a completely different subject: how we properly define the data we are collecting so that, across the public realm, a particular dataset means the same thing. The instance I choose to illustrate this is sex. One would have thought that sex means male or female; in fact, properly construed, there are only two sexes, and I hope the Supreme Court will agree in due course. Gender can be as wide as you like, but sex has two possible values, male or female. If we are collecting data on that in the National Health Service, the police service and other aspects of life to see whether we are treating men and women equally, it is very important that that data item should mean the same thing, but the police now routinely record rapes as being committed by women because the person convicted of rape chooses to identify as a woman, thinking that they will then get better treatment. If you are recording gender, it can be what you want, but if you are recording sex, it should be male or female.
It is really important within the National Health Service that we always mean male or female because male and female physiology differs, and if someone is a candidate for a particular treatment, it may well depend on their sex. For instance, in blood transfusions, it is important to know whether the donation came from a man or a woman, because people may react in different ways to the blood.
Having a data dictionary within government that defines particular terms for use in government statistics, so that statistics collected across different departments are comparable and mean the same thing and you can work with them knowing exactly what they mean, ought to be part of the way we run government. Certainly, whenever I have been involved in collecting data within a largish business, data dictionaries have been common.
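A minimal sketch of the kind of data-dictionary entry being described, assuming a simple Python representation; the entry names, definitions and permitted values are illustrative only, not any real government standard.

```python
from dataclasses import dataclass

# Illustrative data-dictionary entry: each term used in cross-government
# statistics gets one definition and a fixed set of permitted values, so the
# same field means the same thing in every department's dataset.
@dataclass(frozen=True)
class DictionaryEntry:
    name: str
    definition: str
    allowed_values: tuple

    def validate(self, value: str) -> str:
        # Reject anything outside the agreed value set rather than recording
        # it silently, so that datasets stay comparable.
        if value not in self.allowed_values:
            raise ValueError(f"{value!r} is not a permitted value for {self.name!r}")
        return value

# Hypothetical entries; the wording is illustrative, not an official standard.
SEX = DictionaryEntry(
    name="sex",
    definition="Sex, recorded as one of two values.",
    allowed_values=("male", "female"),
)
GENDER = DictionaryEntry(
    name="gender",
    definition="Self-declared gender; may take a wider range of values.",
    allowed_values=("man", "woman", "non-binary", "other", "prefer not to say"),
)

print(SEX.validate("female"))    # accepted
# SEX.validate("unknown")        # would raise ValueError
```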
Lastly, I turn to a third entirely different subject, which is schools admissions data. There is provision in legislation for schools admissions authorities to publish admissions data. This, when it started, was quite useful. Local authorities would publish booklets and you could pick up a booklet for your local authority and see what the admissions rules were for all the schools in that local authority and what the outcomes of those rules had been in previous years. With a little work, you could understand which schools your child had a chance of getting into. That would then form the basis of the investigations you would do about which school you should be using. Over time, the quality of this data has degraded, mostly because the concept of an admissions authority has moved far beyond local authorities, which is where it used to be. Many individual schools and school groups are now their own admissions authority, and they do not share data with the local authority, which means that there is now—certainly in the local authorities I have looked at recently—no consolidated source of schools admissions information, either on the rules prospective pupils are subject to or on the outcomes in previous years.
That makes it a much longer and harder business to establish which schools your child has a right to go to, and the result is that only the socially advantaged can find out what their options are. Anyone short of time or data literacy finds it difficult to know anything beyond which school is nearest and to see all the other options that might be available to them.
That is something which we should turn around, and the way to do so is to make all admissions authorities drop their data into a common database. That is not difficult—it might take someone of medium talent about a day to design—and all schools have this data in a form that is easy to drop into a database, because that data is subject to a data dictionary. Terms are defined, and you know what they mean because they have to be interpreted in a consistent way by parents. It is a really easy thing to create.
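A minimal sketch of how simple such a common database could be, assuming an SQLite store; the table and column names are hypothetical rather than taken from any existing DfE system.

```python
import sqlite3

# Illustrative only: one shared table into which every admissions authority
# drops its published criteria and prior-year outcomes, one row per criterion
# per school per year.
conn = sqlite3.connect("admissions.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS admissions (
    school_urn      TEXT    NOT NULL,  -- school's unique reference number
    admission_year  INTEGER NOT NULL,
    criterion_rank  INTEGER NOT NULL,  -- order in which criteria are applied
    criterion_text  TEXT    NOT NULL,  -- e.g. 'siblings', 'distance from school'
    places_offered  INTEGER,           -- offers made under this criterion
    last_distance_m INTEGER,           -- distance of the last offer, where relevant
    PRIMARY KEY (school_urn, admission_year, criterion_rank)
);
""")

# An admissions authority uploads its data.
conn.execute(
    "INSERT OR REPLACE INTO admissions VALUES (?, ?, ?, ?, ?, ?)",
    ("123456", 2024, 1, "Looked-after children", 3, None),
)
conn.commit()

# A parent-facing tool can then answer a simple question with one query.
for row in conn.execute(
    "SELECT criterion_rank, criterion_text, places_offered FROM admissions "
    "WHERE school_urn = ? AND admission_year = ? ORDER BY criterion_rank",
    ("123456", 2024),
):
    print(row)
```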
Once the data is all in one place, it would be much easier for parents to establish which schools they could send their children to. It would be an opportunity for businesses of all sorts to help parents to make that easier. We ought to be putting ourselves in a position where we are making sure that we do not disadvantage people because they are disadvantaged. We should look after people who find it difficult to deal with differently arranged and differently stated sets of admissions criteria. We should not be disadvantaging people like that; we ought to—it is really quite simple—put them in a position where they are on a level footing with everyone else. I beg to move.
My Lords, I am very grateful to the Minister for her reply and to my noble friends and others for their interventions before that. I am delighted that she considers that Clause 2(3)(a) covers my Amendment 5. If I have any further concerns about that when I have reread her reply in Hansard, I will write to her.
I am sure that we need to do something about data integrity across the piece. I will very much take into account what the Minister has said about the Sullivan review and how sex data is or might be recorded in the future. However, it is a considerable problem that there is no reliable source of it, particularly when it comes to deciding how to treat people medically but also in other circumstances, as my noble friend has said, such as prisons and sports. We have to think through how to have a reliable source of it; passports are clearly not one and, for those with a gender recognition certificate, neither are birth certificates. There are obviously other aspects of life, too, where one wants to know that the data being collected is accurate.
So far as schools’ admissions regulations are concerned, I am afraid the state of the matter is that local authorities are no longer publishing the data that they ought to. The previous Government, who had plenty of time to enforce it, did not and this Government have not yet picked up on that. I will read what the Minister has said and pursue her colleagues in the Department for Education to see if we can get some improvement on the current state of affairs. With thanks to the Minister, I beg leave to withdraw my amendment.
Data (Use and Access) Bill [HL] Debate
(2 months, 3 weeks ago)
Grand Committee
My Lords, I declare an interest in that I checked yesterday and Copilot has clearly scraped data from behind the paywall on the Good Schools Guide. It very kindly does not publish the whole of the review, but it publishes a summary of it. It concerns me how we police copyright and how we get things right in this Bill.
However, I do not think that trying to draw a boundary around “scientific” is the right way to do it. Looking at all the evidence on engineering biology that we have just taken for the Science and Technology Committee, the organisations giving that evidence are all doing science, but they all want to make money out of it at the end, if things go right. There is no sensible boundary between science and commerce. We should expect that, with anything that is done for science, even if it is done in the social sciences, someone at the end of the day will want to build a consultancy on it. There is no defensible boundary between the two.
As my noble friend Lord Camrose said, getting a working definition of public interest is key, as is, in the context of this amendment, recognising the importance of the concepts of intellectual property: copyright, trade marks, patents and so on. They are international concepts, and we should seek to hold the line in the face of technological challenges, because the concepts as they stand have shown their worth. We may have to adapt them in one way or another, but that should be done internationally, and we should not support local infringement, because we would then make the UK a much less worthwhile place in which to hold intellectual property. My own intellectual property is not mobile, but a lot of intellectual property is, and it wants to be held in a place where it can be defended. If we do not offer that in our legal system, we will lose a great deal by it.
Data (Use and Access) Bill [HL] Debate
(2 months, 2 weeks ago)
Grand Committee
My Lords, I was in such a hurry to apologise just now for missing Second Reading that I forgot to declare my interests and remind the Committee of my technology and, with regard to this group, charitable interests as set out in the register.
I shall speak to Amendments 95, 96, 98, 101, 102 and 104 in my name and those of the noble Lords, Lord Clement-Jones and Lord Stevenson of Balmacara, and my noble friend Lord Black of Brentwood, and Amendments 103 and 106 in my name and those of the noble Lords, Lord Clement-Jones and Lord Stevenson. I also support Amendment 162 in the name of the noble Lord, Lord Clement-Jones. I will speak only on the marketing amendments in my name and leave the noble Lord, Lord Clement-Jones, to do, I am sure, great justice to the charitable soft opt-in.
These amendments are nothing like as philosophical and emotive as the last amendment on children and AI. They aim to address a practical issue that we debated in the late spring on the Data Protection and Digital Information Bill. I will not rehearse the arguments that we made, not least because the Minister was the co-signatory of those amendments, so I know she is well versed in them.
Instead, I shall update the Committee on what has happened since then and draw noble Lords’ attention to a couple of issues that are very real and present now. It is strange that all Governments seem reluctant to restrict the new technology companies’ use of our data but extremely keen to get into the micro detail of restricting older ways of using data that we have all become quite used to.
That is very much the case for the open electoral register. Some 63% of people have opted out of being marketed to, having recorded that preference when registering to vote. This is a well known and well understood use of personal data. Yet, because of the tribunal ruling, it is increasingly the case that companies cannot use the open electoral register to target the 37% of people who have said that they are quite happy to receive marketing unless the company lets every single one of those people know that it is about to market to them. The danger is that we create a new cookie problem—a physical cookie problem—where, if you want to use a data source that has been commonplace for 40 years, you have to send some marketing to tell people that you are about to use it. That of course means that you will not do so, which reduces the data available to a lot of small and medium-sized businesses for marketing their products and hands the advantage straight to the very big tech companies, which are really happy to scrape our data all over the place.
This is a strange one, where I find myself arguing that something which is not broken does not need to be fixed. I appreciate that the Minister will probably tell us that the wording in these amendments is not appropriate. As I said earlier in the year—in April, in the Bill’s previous incarnation—I very much hope that, if the wording is incorrect, we can have a discussion between Committee and Report and agree on some wording that achieves what seems just practical common sense.
The tribunal ruling that created this problem recognised that it was causing a problem. It stated that it accepted that the loophole it created would allow one company, Experian, a sizeable competitive advantage. It is a slightly perverse one: it means that it has to let only 5 million people know that it might be about to use the open electoral register, while its competitors have to let 22 million people know. That just does not pass the common-sense test of practical use of data. Given the prior support that the Minister has shown for this issue, I very much hope that we can resolve it between Committee and Report. I beg to move.
My Lords, I have a couple of amendments in this group, Amendments 158 and 161. Amendment 158 is largely self-evident; it tries to make sure that, where there is a legal requirement to communicate, that communication is not obstructed by the Bill. I would say much the same of Amendment 161: again, it is obvious that there ought to be easy communication where a person’s pension is concerned, and the Bill should not obstruct it. I am not saying that these are the only ways to achieve these things, but they should be achieved.
I declare an interest on Amendment 160, in that I control the website of the Good Schools Guide, which carries advertising. The function of advertising on the web is to enable people to see things for free; it is why the web has not closed down into subscription-only services. If people put advertisements on the web, they want to know that they are effective and have been seen, and to have some information about who has seen them. I moved a similar amendment to the previous Government’s Bill and encountered some difficulty. If the Government are of the same mind—that this requires us to be careful—I would very much welcome the opportunity of a meeting between now and Report, and I imagine others would too, to try to understand how best to make sure that advertising can flourish on the internet.
I am very happy to talk to the noble Baroness about this issue. She asked what the Government’s view is; we are listening very carefully to the Information Commissioner and the advice that he is putting together on this issue.
My Lords, I am very grateful for the answers the noble Baroness gave to my amendments. I will study carefully what she said in Hansard, and if I have anything further to ask, I will write to her.
My Lords, in response—and very briefly, given the technical nature of all these amendments—I think that we should just note that there are a number of different issues in this group, all of which I think noble Lords in this debate will want to follow up. I thank the many noble Lords who have contributed both this time round and in the previous iterations, and ask that we follow up on each of the different issues, probably separately rather than in one group, as we will get ourselves quite tangled in the web of data if we are not careful. With that, I beg leave to withdraw the amendment.
My Lords, my Amendment 115 would similarly act in that way by making automated decision-making processes explain themselves to the people affected by them. This would be a much better way of controlling the quality of what is going on with automated decision-making than restricting that sort of information to professionals—people who are overworked anyway and have a lot of other things to do. There is no one more interested in the decision of an automated process than the person about whom it is being made. If we are to trust these systems, their ability—far beyond any human’s—to take the time to explain why they took the decision they did, which, if the machine is any good, it knows and can easily set out, is surely the way to generate trust: you can see exactly what decision has been made and why, and you can respond to it.
This would, beyond anything else, produce a much better system for our young people when they apply for their first job. My daughter’s friends in that position are running into hundreds of unexplained rejections. This is not a good way to treat young people. It does not help them to improve or understand what is going on. I completely understand why firms do not explain: they have so many applications that they just do not have the time or the personnel to sit down and write a response—but that does not apply to an automated decision-making machine. It could produce a much better situation when it comes to hiring.
As I said, my principal concern, to echo that of the noble Viscount, is that it would give us sight of the decisions that have been taken and why. If it becomes evident that they are taken well and for good reasons, we shall learn to trust them. If it becomes evident that they really are not fair or understandable, we shall be in a position to demand changes.
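A minimal sketch of how an automated screening system could return reasons alongside its decision, assuming invented rules, thresholds and wording; it illustrates the general point rather than anything in Amendment 115 itself.

```python
# Illustrative only: a toy screening model that records, for every rule it
# applies, whether the rule was met, so an unsuccessful applicant can be sent
# the specific reasons automatically rather than being ignored.

RULES = [
    ("minimum_experience",
     lambda a: a["years_experience"] >= 2,
     "at least two years' relevant experience"),
    ("right_to_work",
     lambda a: a["right_to_work"],
     "evidence of the right to work in the UK"),
    ("required_skill",
     lambda a: "python" in a["skills"],
     "experience with Python listed on the application"),
]

def screen(applicant: dict) -> tuple:
    """Return a decision plus a plain-language explanation of it."""
    unmet = [description for name, test, description in RULES if not test(applicant)]
    if not unmet:
        return True, "Your application met every screening criterion."
    return False, ("Your application was not shortlisted because the following "
                   "criteria were not met: " + "; ".join(unmet) + ".")

ok, explanation = screen(
    {"years_experience": 1, "right_to_work": True, "skills": ["excel"]}
)
print(ok)           # False
print(explanation)  # lists the specific unmet criteria
```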
My Lords, it is a pleasure to take part in the debate on this group. I support the spirit of all the amendments debated thus far.
Speaking of spirits, and it being the season, I have more than a degree of sympathy for the Minister. With so many references to her previous work, this Christmas is turning into a bit of the Ghost of Amendments Past for her. That is good, because all the amendments she put down in the past were of an excellent quality, well thought through, equally considered and even-handed.
As has been mentioned many times, we have had three versions of a data Bill so far over just over three years. One wonders whether all the elements of this current draft have kept up with what has happened in the outside world over those three years, not least when it comes to artificial intelligence. This goes to the heart of the amendments in this group on automated decision-making.
When the first of these data Bills emerged, ADM was present—but relatively discreetly present—in our society and our economy. Now it would be fair to say that it proliferates across many areas of our economy and our society, often in situations where people find themselves at the sharpest end of the economy and the sharpest end of these automated decisions, often without even knowing that ADM was present. More than that, even on the discovery that ADM was in the mix, depending on which sector of the economy or society they find that decision being made in, they may find themselves with no or precious little redress—employment and recruitment, to name but one sector.
It being the season, it is high time when it comes to ADM that we start to talk turkey. In all the comments thus far, we are talking not just about ADM but about the principles that should underpin all elements of artificial intelligence—that is, they should be human led. These technologies should be in our human hands, with our human values feeding into human oversight: human in the loop and indeed, where appropriate, human over the loop.
That goes to elements in my two amendments in this group, Amendments 123A and 123B. Amendment 123A simply posits, through a number of paragraphs, the point that, if someone is subject to an automated decision, they have the right to a personalised explanation of that decision. That explanation should be accessible: in plain language of their choice, at no cost to them and not in any sense technically or technologically convoluted or opaque. That would be relatively straightforward to achieve, but the positive impact for all those citizens would certainly be more than material.
Amendment 123B goes to the heart of those humans charged with the delivery of these personalised explanations. It is not enough to simply say that there are individuals within an organisation responsible for the provision of personalised explanations for automated decisions; it is critical that those individuals have the training, the capabilities and, perhaps most importantly, the authority within that organisation to make a meaningful impact regarding those personalised explanations. If not, this measure may have a small voice but would have absolutely no teeth when it comes to the citizen.
In short, ADM is proliferating so we need to ensure that we have a symmetrical situation for citizens, for consumers, and for anyone who finds themselves in any domain or sector of our economy and society. We must assert the principles: human-led, human in the loop, “Our decisions, our data”, and “We determine, we decide, we choose”. That is how I believe we can have an effective, positive, enabling and empowering AI future. I look forward to the Minister’s comments.
My understanding is that it would be. Every individual who was affected would receive their own notification rather than it just being on a website, for example.
Let me just make sure I have not missed anyone out. On Amendment 123B on addressing bias in automated decision-making, compliance with the data protection principles, including accuracy, transparency and fairness, will ensure that organisations take the necessary measures to address the risk of bias.
On Amendment 123C from the noble Lord, Lord Clement-Jones, I reassure him that the Government strongly agree that employment rights should be fit for a modern economy. The plan to make work pay will achieve this by addressing the challenges introduced by new trends and technologies. I agree very much with my noble friend Lord Knight that, although we have to get this right, there are opportunities for a different form of work, and we should not see this only as a potential negative impact on people’s lives. However, we want to get the balance right with regard to the impact on individuals, to make sure that we get the best out of it rather than the possible negative effects.
Employment rights law is more suitable than data protection law in isolation for regulating the specific use of data and technology in the workplace, as data protection law sets out general rules and principles for processing that apply in all contexts. Noble Lords can rest assured that we take the impact on employment and work very seriously, and as part of our plan to make work pay and the Employment Rights Bill, we will return to these issues.
On Amendments 119, 120, 121 and 122, tabled by the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and my noble friend Lord Knight, the Government share the noble Lords’ belief in the importance of public sector algorithmic transparency, and, as the noble Lord, Lord Clement-Jones, reminded us, we had a very good debate on this last week. The algorithmic transparency recording standard is already mandatory for government departments and arm’s-length bodies. This is a cross-government policy mandate underpinned by digital spend controls, which means that when budget is requested for a relevant tool, the team in question must commit to publishing an ATRS record before receiving the funds.
As I said on Friday, we are implementing this policy accordingly, and I hope to publish further records imminently. I very much hope that when noble Lords see what I hope will be a significant number of new records on this, they will be reassured that the nature of the mandation and the obligation on public sector departments is working.
Policy routes also enable us to provide detailed guidance to the public sector on how to carry out its responsibilities and monitor compliance. Examples include the data ethics framework, the generative AI framework, and the guidelines for AI procurement. Additionally, the data protection framework already achieves some of the intended outcomes of these amendments. It requires organisations, including public authorities, to demonstrate how they have identified and mitigated risks when processing personal data. The ICO provides guidance on how organisations can audit their privacy management and ensure a high level of data protection compliance.
I know I have given a great deal of detail there. If I have not covered all the points that the noble Lords have raised, I will write. In the meantime, given the above assurances, I hope that the noble Lord will withdraw his amendment.
My Lords, I would be very grateful if the Minister wrote to me about Amendment 115. I have done my best before and after to study Clause 80 to understand how it provides the safeguards she describes, and have failed. If she or her officials could take the example of a job application and the responses expected from it, and take me through the clauses to understand what sort of response would be expected and how that is set out in the legislation, I would be most grateful.
My Lords, I have Amendment 201 in this group. At the moment, Action Fraud does not record attempted fraud; it has to have been successful for the website to agree to record it. I think that results in the Government taking decisions based on distorted and incomplete data. Collecting full data must be the right thing to do.
My Lords, I had expected the noble Baroness, Lady Owen of Alderley Edge, to be in the Room at this point. She is not, so I wish to draw the Committee’s attention to her Amendment 210. On Friday, many of us were in the Chamber when she made a fantastic case for her Private Member’s Bill. It obviously dealt with a much broader set of issues but, as we have just heard, the overwhelming feeling of the House was to support her. I think we would all like to see the Government wrap it up, put a bow on it and give it to us all for Christmas. But, given that that was not the indication we got, I believe that the noble Baroness’s intention here is to deal with the fact that the police are giving phones and devices back to perpetrators with the images remaining on them. That is an extraordinary revictimisation of people who have been through enough. So, whether or not this is the exact wording or way to do it, I urge the Government to look on this carefully and positively to find a way of allowing the police the legal right to delete data in those circumstances.
I have Amendment 135A in this group. The Bill provides a new set of duties for the Information Commissioner but no strategic framework, as the DPDI Bill did. The Information Commissioner is a whole-economy regulator. To my mind, the Government’s strategic priorities should bear on it. This amendment would provide an enabling power, such as that which the Competition and Markets Authority, which is in an equivalent economic position, already has.
My Lords, I have huge sympathy for, and experience of, many of the issues raised by the noble Lord, Lord Clement-Jones, but, given the hour, I will speak only to Amendment 145 in my name and those of the noble Baroness, Lady Harding, my noble friend Lord Russell and the noble Lord, Lord Stevenson. Given that I am so critical, I want to say how pleased I am to see the ICO reporting requirements included in the Bill.
Amendment 145 is very narrow. It would require the ICO to report specifically and separately on children. It is fair to say that one of the many frustrations for those of us who spend our time advocating for children’s privacy and safety is trying to extrapolate child-specific data from generalised reporting. Often it is not reported at all, because that helps to hide some of the inadequacies in the level of protection afforded to children. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snapchat provides a breakdown of the violation rate by age group, even though that would provide valuable information for academics, Governments, legislators, NGOs and, of course, regulators. It was a point of contention between many civil society organisations and Ofcom that there was no evidence that children of different ages react in different ways, which, for anyone who has had children, is clearly not the case.
Similarly, for many years we struggled to understand Ofcom’s reporting because older children were included in a group that went up to 24, and it took over 10 years for that to change. It seems to me—I hope the Government agree—that since children are entitled to specific data privacy benefits, it follows that the application and enforcement of those benefits should be reported separately. I hope that the Government can give a quick yes on this small but important amendment.
Data (Use and Access) Bill [HL] Debate
Full Debate: Read Full DebateLord Lucas
Main Page: Lord Lucas (Conservative - Excepted Hereditary)Department Debates - View all Lord Lucas's debates with the Department for Business and Trade
(2 months, 1 week ago)
Grand Committee
My Lords, I make a brief intervention. I am not against these amendments—they are very useful in the context of the Bill. However, I am reflecting on the fact that, when we drafted GDPR, we took a six-year process and failed in the course of doing so really to accommodate AI, which keeps popping up every so often in this Bill. Every part of every amendment seems to have a new subsection referring to automated decisions or to AI generally.
Obviously, we are moving on to have legislation on AI in due course, and I am sure that a number of pieces of legislation, no doubt including this one, will be able to be used as part of our overall package when we deal with the regulation of AI. However, although it is true that the UK GDPR gives, in theory, a higher standard of protection to children, it is important to consider that, in the context of AI, the protections that we need will have to be much greater—we know that. But if there is going to be a code of practice for children and educational settings, we need also to consider vulnerable and disabled people and other categories of people who are equally entitled to help, and who particularly need it with regard to the AI elements. That is going to be very difficult. Most adults I know understand less about AI than children approaching the age of 18, who are much more knowledgeable. Those children are also more aware of the restrictions that will have to be put in place than adults, who appear to be completely at sea, not even understanding what AI is about.
I make a precautionary point. While we have AI dotted all the way through this Bill, we should be very careful that, when we specify protection for a particular group—in this case, children—we remain aware of the need to have protection in place for other groups, particularly in the context of this Bill and, indeed, future legislation.
My Lords, I very much support the thrust of these amendments and what the noble Lord, Lord Knight, said in support of and in addition to them. I declare an interest as a current user of the national pupil database.
The proper codification of safeguards would be a huge help. As the noble Baroness, Lady Kidron, said, it would give us a foundation on which to build. I hope that, if they are going to go in this direction, the Government will take an immediate opportunity to do so because what we have here, albeit much more disorganised, is a data resource equivalent to what we have for the National Health Service. If we used all the data on children that these systems generate, we would find it much easier to know what works and in what circumstances, as well as how to keep improving our education system.
The fact that this data is tucked away in little silos—it is not shared and is not something that can be used on a national basis—is a great pity. If we have a national code as to how this data is handled, we enable something like the use of educational data in the way that the NHS proposes to use health data. Safeguards are needed on that level but the Government have a huge opportunity; I very much hope that it is one they will take.
I start by thanking all noble Lords who spoke; I enjoyed the vivid examples that were shared by so many of them. I particularly enjoyed the comment from the noble Lord, Lord Russell, about the huge gulf in difference between guidance, of which there is far too much, and a code that actually drives matters forward.
I will speak much more briefly because this ground has been well covered already. Both the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, seek to introduce codes of practice to protect the data of children in education services. Amendment 138 in the name of the noble Lord seeks to introduce a code on processing personal data in education. This includes consultation for the creation of such a code—a highly important element because the safety of this data, as well as its eventual usage, is of course paramount. Amendment 141 in the name of the noble Baroness, Lady Kidron, also seeks to set out a code of practice to provide heightened protections for children in education.
Those amendments are absolutely right to include consultation. It is a particularly important area of legislation. It is important that it does not restrict what schools can do with their data in order to improve the quality and productivity of their work. I was very appreciative of the words of the noble Lord, Lord Knight, when he sketched out some of what becomes educationally possible when these technologies are wisely and safely used. With individual schools often responsible for the selection and procurement of technologies, the landscape is—at the risk of understatement—often more complex than we would wish.
Alongside that, the importance of the AI Safety Institute’s role in consultation cannot be overstated. The way in which tech and AI have developed in recent years means that its expertise on how safely to provide AI to this particularly vulnerable group is invaluable.
I very much welcome the emphasis that these amendments place on protecting children’s data, particularly in the realm of education services. Schools are a safe place. That safety being jeopardised by the rapid evolution of technology that the law cannot keep pace with would, I think we can all agree, be unthinkable. As such, I hope that the Government will give careful consideration to the points raised as we move on to Report.
My Lords, I support Amendments 204, 205 and 206 in the names of my noble friends Lady Kidron and Lord Freyberg, and of the noble Lords, Lord Stevenson and Lord Clement-Jones, in what rapidly seems to be becoming the Cross-Bench creative club.
I spent 25 years as a professional photographer in London from the late 1980s. When I started, retouchers would retouch negatives and slides by hand, charging £500 an hour. Photoshop stopped that. Professional film labs such as Joe’s Basement and Metro would work 24 hours a day. Snappy Snaps and similar catered for the amateur market. Digital cameras stopped that. Many companies provided art prints, laminating and sundry items for professional portfolios. PDFs and websites stopped that. Many different forms of photography, particularly travel photography, were taken away when picture libraries cornered the market and drove down commissions to unsustainable levels. There were hundreds if not thousands of professional photographers in the country. The smartphone has virtually stopped that.
All these changes were evolution and the result of a world becoming more digitised, but AI web crawlers are different, illegally scraping images without consent or payment and then potentially killing the trade of the victim by setting up in competition. This is a parasite, but not in the true sense, because a parasite is careful to keep its victims alive.
My Lords, I very much support these amendments. I declare an interest as an owner of written copyright in the Good Schools Guide and as a father of an illustrator. In both contexts, it is very important that we get intellectual property right, as I think the Government recognised in what they put out yesterday. However, I share the scepticism of those who have spoken as to whether the Government’s ideas can be made to work.
It is really important that we get this straight. For those of us operating at the small end of the scale, IP is under continual threat from established media. I write maybe 10 or a dozen letters a year to large media outfits reminding them of the boundaries, the latest to the Catholic Herald—it appears that not even the Ten Commandments have force on them. But what AI can do is a great deal more difficult to deal with. I can absolutely see, by talking to Copilot, that it has gone through my paywall and absorbed the contents of the Good Schools Guide, but who am I supposed to go after for this? Who has actually done the trespassing? Who is responsible for it? Where is the ownership? It is difficult to enforce copyright even by writing a polite letter to someone saying, “Please don’t do this”. The Government appear to propose a system of polite letters saying, “Oh dear, it looks as if you might have borrowed my copyright. Please, can you give it back?”
This is not practically enforceable, and it will not result in people who care about IP locating their businesses here. Quite clearly, we do not have ownership of the big AI systems, and it is unlikely that we will have ownership of them—all that will be overseas. What we can do is create IP. If we produce a system where we do not defend the IP that we produce, then fairly rapidly, those IP creators who are capable of being mobile will go elsewhere to places that will defend their IP. It is something that a Government who are interested in growth really ought to be interested in defending. I hope that we will see some real progress in the course of the Bill going through the House.
My Lords, I declare my AI interests as set out in the register. I will speak in support of Amendments 204, 205 and 206, which have been spoken to so inspiringly by the noble Baroness, Lady Kidron, and so well by the noble Lords, Lord Freyberg, Lord Lucas and Lord Hampton, the noble Earl, Lord Clancarty, and the noble Viscount, Lord Colville. Each demonstrated different facets of the issue.
I co-chair the All-Party Group on AI and chaired the AI Select Committee a few years ago. I wrote a book earlier this year on AI regulation, which had a namecheck from the noble Baroness, Lady Jones, at Question Time, which I was very grateful for. Before that, I had a career as an IP lawyer, defending copyright and creativity, and in this House, I have been my party’s creative industries spokesperson. The question of IP and the training of generative AI models is a key issue for me.
This is the case not just in the UK but around the world. Getty and the New York Times are suing in the United States, as are many writers, artists and musicians. It was at the root of the Hollywood actors’ and writers’ strikes last year. It is one thing to use the tech—many of us are AI enthusiasts—but it is another to be at the mercy of it.
Close to home, the FT has pointed out, using the index published online by the creator of an unlicensed dataset called Books3, that over 85 books written by 33 Members of the House of Lords can be identified as having been pirated to train AI models from household names such as Meta, Microsoft and Bloomberg. Although it is absolutely clear that the use of copyrighted works to train AI models is contrary to UK copyright law, the laws around the transparency of these activities have not caught up. As we have heard, as well as using pirated e-books in their training data, AI developers scrape the internet for valuable professional journalism and other media, in breach of both the terms of service of websites and copyright law, in order to train commercial AI models. At present, developers can do this without declaring their identity, or they may take IP that was scraped for inclusion in a search index and use it for the completely different commercial purpose of training AI models.
How can rights owners opt out of something that they do not know about? AI developers will often scrape websites or access other pirated material before they launch an LLM in public. This means that there is no way for IP owners to opt out of their material being taken before it is included in these models. Once the material has been used to train these models, the commercial value, as we have heard, has already been extracted from IP scraped without permission, and there is no way to delete the data from these models.
The next wave of AI models responds to user queries by browsing the web to extract valuable news and information from professional news websites. This is known as retrieval-augmented generation—RAG. Without payment for extracting this commercial value, AI agents built by companies such as Perplexity, Google and Meta will, in effect, free-ride on the professional hard work of journalists, authors and creators. At present, such crawlers are hard to block. There is no market failure; there are well-established licensing solutions. There is no uncertainty around the existing law; the UK is absolutely clear that commercial organisations, including gen AI developers, must license the data that they use to train their large language models.
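A minimal sketch of why such crawlers are hard to block, assuming a publisher filters requests by user agent; the bot names are examples only.

```python
# Illustrative only: a naive gatekeeper that turns away crawlers which
# identify themselves as AI data-collection bots. A crawler that does not
# announce itself honestly presents an ordinary browser user agent and
# sails straight through, which is why blocking is hard in practice.

BLOCKED_AGENT_SUBSTRINGS = ("GPTBot", "CCBot", "PerplexityBot", "Bytespider")

def allow_request(user_agent: str) -> bool:
    """Return False for self-declared AI crawlers, True for everything else."""
    return not any(bot.lower() in user_agent.lower()
                   for bot in BLOCKED_AGENT_SUBSTRINGS)

print(allow_request("Mozilla/5.0 (compatible; GPTBot/1.0)"))  # False: refused
print(allow_request("Mozilla/5.0 (Windows NT 10.0; Win64)"))  # True: allowed,
# even if this is an undeclared scraper rather than a human reader
```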
Here, as the Government’s intentions become clearer, the political, business and creative temperature is rising. Just this week, we have seen the creation of a new campaign, the Creative Rights in AI Coalition—CRAIC—across the creative and news industries and, recently, a statement organised by Ed Newton-Rex reached more than 30,000 signatories from among creators and creative organisations.
My Lords, having a system such as this would really focus the public sector on how we can generate more datasets. As I said earlier, education is an obvious one, but so is mobile phone data. All these companies have their licences. If a condition of the licence was that the data on how people move around the UK became a public asset, that would be hugely beneficial to policy formation. If we really understood how, why and when people move, we would make much better decisions. We could save ourselves huge amounts of money. We really ought to have this as a deep focus of government policy.
My Lords, I have far too little time to do justice to this subject. We on these Benches welcome this amendment. It is entirely consistent with the sovereign health fund proposed by Future Care Capital and, indeed, with the proposals from the Tony Blair Institute for Global Change on a similar concept called the national data trust. Indeed, this concept formed part of our Liberal Democrat manifesto at the last general election, so of course I support the amendment.
It would be very useful to hear more about the national data library, including on its purpose and operation, as the noble Baroness, Lady Kidron, said. I entirely agree with her that there is a great need for a sovereign cloud service or services. Indeed, the inability to guarantee that data on the cloud is held in this country is a real issue that has not yet been properly addressed.
My Lords, environmental data, specifically such things as biodiversity data, is a key component to getting policy in this area right. To do so, we need to make sure that all the good data we are generating around the UK gets into our storage system, and that the best possible and most complete data is used whenever we make decisions.
We currently run that through a system of local environmental records centres that are independent and not for profit. Since that is the system we have, it ought to be run right. At the moment, we are failing to capture a lot of quality data because the data is not coming in from the planning system, or from other similar functions, in the way that it should. We are not consistently using that data in planning as we should. Natural England, which ought to be intimately linked into this system, has stepped away from it for budgetary reasons. The environment is important to us. If the Government are serious about that, we have to get our data collection and use system right. I beg to move.
My Lords, I thank the noble Lord, Lord Lucas, for his Amendment 211F. I absolutely agree that local environmental records centres provide an important service. I reassure noble Lords that the Government’s digital planning programme is developing data standards and tools to increase the availability, accessibility and usability of planning data. This will transform people’s experience of planning and housing, including through local environmental records centres. On that basis, I must ask the noble Lord whether he is prepared to withdraw his amendment.
My Lords, I am grateful for that extensive answer from the Minister. If I have anything that I hope she might add, I will write to her afterwards.
My heart is always in the cause of making sure that the Government get their business done on time every time, and that we finish Committee stages when they ask, as doubtless they will discover with some of the other Bills they have in this Session. For now, I beg leave to withdraw my amendment.
Data (Use and Access) Bill [HL] Debate
(1 month, 1 week ago)
Lords Chamber
My Lords, in moving Amendment 6 in my name, I will also speak to Amendment 8. This section of the Bill deals with digital verification services, and the root word there is verify/veritas—truth. Digital verification input must be truthful for the digital system to work. It is fundamental.
One can find all sorts of workarounds for old analogue systems. They are very flexible. With digital, one has to be precise. Noble Lords may remember the case in November of baby Lilah from Sutton-in-Ashfield, who was registered at birth as male by accident, as she was clearly female. The family corrected this on the birth register by means of a marginal note. There is no provision in law to correct an error on a birth certificate other than a marginal note. That works in analogue—it is there on the certificate—but in digital these are separate fields. In the digital systems, her sex is recorded as male.
I can be absolutely clear that we must have a single version of the truth on this. There needs to be a way to verify it consistently and there need to be rules. That is why the ongoing work is so important. I know from my background in scientific research that, to know what you are dealing with, data is the most important thing to get. Making sure that we have a system to get this clear will be part of what we are doing.
Amendment 6 would require the Secretary of State to assess which public authorities can reliably verify related facts about a person in the preparation of the trust framework. This exercise is out of scope of the trust framework, as the Good Practice Guide 45—a standard signposted in the trust framework—already provides guidance for assessing the reliability of authoritative information across a wide range of use cases covered by the trust framework. Furthermore, the public authorities mentioned are already subject to data protection legislation which requires personal data processed to be accurate and, where relevant, kept up to date.
Amendment 8 would require any information shared by public authorities to be clearly defined, accompanied by metadata and accurate. The Government already support and prioritise the accuracy of the data they store, and I indicated the ongoing work to make sure that this continues to be looked at and improved. This amendment could duplicate or potentially conflict with existing protections under data protection legislation and/or other legal obligations. I reassure noble Lords that the Government believe that ensuring the data they process is accurate is essential to deliver services that meet citizens’ needs and ensure accurate evaluation and research. The Central Digital and Data Office has already started work on developing data standards on key entities and their attributes to ensure that the way data is organised, stored and shared is consistent.
It is our belief that these matters are more appropriately considered together holistically, rather than by a piecemeal approach through diverse legislation such as this data Bill. As such, I would be grateful if noble Lords would consider withdrawing their amendments.
My Lords, I am very grateful to all noble Lords who have spoken on this. I actually rather liked the amendments of the noble Lord, Lord Clement-Jones—if I am allowed to reach across to him—but I think he is wrong to describe Amendments 6 and 8 as “culture war”. They are very much about AI and the fundamentals of digital. Self-ID is an attractive thought; I would very much like to self-identify as a life Peer at the moment.
However, the truth should come before personal feelings, particularly when looking at data and the fundamentals of society. I hope that the noble Lord will take parliamentary opportunities to bring the framework in front of Parliament when it appears. I agree with him that Parliament should take an interest in and look at this, and I hope we will be able to do that through a short debate at some stage—or that he will be able to, because I suspect that I shall not be here to do so. It is important that, where such fundamental rights and the need for understanding are involved, there is a high degree of openness. However expert the consideration the Government may give this through the mechanisms the Minister has described, I do not think they go far enough.
So far as my own amendments are concerned, I appreciate very much what the Minister has said. We are clearly coming from the same place, but we should not let the opportunity of this Bill drift. We should put down the marker here that this is an absolutely key part of getting data and government right. I therefore beg leave to test the opinion of the House.
My Lords, we are playing a bit of Jack-in-the-box. When I was being taught law by a wonderful person from Gray’s Inn, Sir Dingle Foot, who was responsible for drafting the constitution for Uganda’s independence, he said a phrase which struck me and which has always stayed with me: law is a statement of public policy. The noble Viscount, Lord Colville, seeks that, if there is to be scientific work, it must be conducted “in the public interest”. Law does not express itself simply for itself; it does so for the public, as public policy. It would be a wonderful phrase to include, and I hope the Minister will accept it so that we do not have to vote on it.
My Lords, the regulator quite clearly needs a standard against which to judge. Public interest is the established one in FOI, medicine and elsewhere. It is the standard used when I apply for data from the National Pupil Database—and quite right too. It works well, it is flexible, it is well understood and it is a decent test to meet. We really ought to insist on it today.
My Lords, I want to add very quickly that we have a problem here. If someone took all this private data because we did not put this block on them, and they then had it, it would probably become their copyright and their property, which they could then sit on and prevent other people from getting at. This amendment is fairly essential.
Data (Use and Access) Bill [HL] Debate
Lord Lucas, Department for Science, Innovation & Technology
(1 month, 1 week ago)
Lords Chamber
My Lords, I support what the noble Baroness, Lady Freeman, said. Her maiden speech was a forewarning of how good her subsequent speeches would be and how dedicated she is to openness, which is absolutely crucial in this area. We are going to have to get used to a lot of automatic processes and come to consider that they are by and large fair. Unless we are able to challenge them, understand them and see that they have been properly looked after, we are not going to develop that degree of trust in them.
Anyone who has used current AI programs will know about the capacity of AI for hallucination. The noble Lord, Lord Clement-Jones, uses them a lot. I have been looking, with the noble Lord, Lord Saatchi, at how we could use them in this House to deal with the huge information flows we have and to help us understand the depths of some of the bigger problems and challenges we are asked to get a grip on. But AI can just invent things, leaping at an answer that is easier to find, ignoring two-thirds of the evidence and not understanding the difference between reliable and unreliable witnesses.
There is so much potential, but there is so much that needs to be done to make AI something we can comfortably rely on. The only way to get there is to be absolutely open and to allow and encourage challenge. The direction pointed out by the noble Lord, Lord Clement-Jones, and, most particularly, by the noble Baroness, Lady Freeman, is one that I very much think we should follow.
My Lords, I will very briefly speak to Amendment 30 in my name. Curiously, it was in the name of the noble Viscount, Lord Camrose, in Committee, but somehow it has jumped.
On the whole, I have always advocated for age-appropriate solutions. The amendment refers to preventing children from consenting to special category data being used in automated decision-making, simply because there are some things that children should not be able to consent to.
I am not sure that this exact amendment is the answer. I hope that the conversation we had before the dinner break will produce some thought about this issue—about how automated decision-making affects children specifically—and that we can deal with it in a slightly different way.
While I am on my feet, I want to say that I was very struck by the words of my noble friend Lady Freeman, particularly about efficacy. I have seen so many things that purported to work in clinical conditions but failed to work in the complexity of real life, and I want to associate myself with her words and, indeed, with the amendments in her name and that of the noble Lord, Lord Clement-Jones.
Data (Use and Access) Bill [HL] Debate
Lord Lucas, Department for Science, Innovation & Technology
(1 month ago)
Lords Chamber
My Lords, I will also speak to Amendment 50A. I have sent the Government a reasonably lengthy explanation of what I am up to here, so I will restrict myself to a summary for the purposes of Report.
To my mind, there is a necessary distinction between a service message and a regulatory communication. A service message relates to an existing contract, and you do not want it full of marketing material, but regulatory communications often have to contain something that would be judged by the ICO to be marketing material—they are required to. Under those circumstances, there should be a required balancing of harms: the harm of not complying with what the regulator would like against the harm of issuing a marketing communication without permission.
This is never going to be simple. It is always going to be case-by-case, but we should recognise that there are times when regulators want to encourage people to take particular actions and want the service providers to be part of that. We should allow for that in the wording of the Bill. I beg to move.
My Lords, I will start with Amendments 48A and 50A in the name of the noble Lord, Lord Lucas. The Government are aware that some financial services firms have raised concerns that the direct marketing rules in the Privacy and Electronic Communications Regulations prevent them from supporting consumers in some instances. I appreciate the importance of the support that financial services firms provide to their customers to help them make informed decisions on matters such as their financial investments. The Government and the FCA are working closely together to improve the support available to consumers.
In December, the FCA launched an initial consultation on a new type of support for consumers with their investments and pensions, called "targeted support". Through this consultation, the FCA will seek feedback on any interaction between the proposals and the direct marketing rules. As my noble friend Lady Jones explained in the debate in Grand Committee, firms can already send service or regulatory communication messages to their customers without permission, provided these messages are neutral in tone, factual and do not include promotional content. Promotional content can be sent if a consumer consents to receiving direct marketing. Messages which are not directed to a particular individual, such as online adverts shown to everyone who views a website, are also not prevented by the rules. I hope this explanation, and the fact that there is ongoing work, provides some reassurance to the noble Lord, Lord Lucas, that the Government are actively looking into this issue and that, as such, he will be content to withdraw his amendment.
Amendment 48B from the noble Lord, Lord Clement-Jones, is aimed at banning cookie paywalls. These generally work by giving web users the option to pay for a cookie-free browsing experience. Many websites are funded by advertising, and some publishers think that people should pay for a viewing experience without personalised advertising. As he rightly pointed out, the ICO released updated guidance on how organisations can deploy “consent or pay” models while still ensuring that consent is “freely given”. The guidance is detailed and outlines important factors that organisations should consider in order to operate legally. We encourage businesses to read this guidance and respond accordingly.
I note the important points that the noble Lord makes, and the counterpoints made by the noble Viscount, Lord Camrose. The Government will continue to engage with businesses, the ICO and users on these models, and on the guidance, but we do not think there is currently a case for taking action to ban the practice. I therefore hope the noble Lord will not press his amendment.
My Lords, I am grateful to the Minister for that explanation. I will, for the moment, be content to know that the Government are continuing to discuss this. There is a real problem here that will need to be dealt with, but if the Government are engaged they will inevitably find themselves having to deal with it. There are some occasions in regulatory messages where you need to make the options clear: "You need to do this, or something else will happen and you will really disadvantage yourself". The regulator will expect that, particularly where things such as pensions are concerned, but it is clearly a marketing message. It will be difficult to resolve, but I am happy to trust the Government to have a go at it and not to insist on the particular formulation of these amendments. I beg leave to withdraw my amendment.
Data (Use and Access) Bill [HL] Debate
Lord Lucas, Department for Science, Innovation & Technology
(1 month ago)
Lords Chamber
My Lords, I very much encourage the Government to go down this road. Everyone talks about the NHS simply because the data is there and organised. If we establish a structure like this, there are other sources of data that we could develop to equivalent value. Education is the obvious one. What works in education? We have huge amounts of data, in both schools and higher education, but we do nothing with it. What is happening to biodiversity? We do not presently collect the data or use it in the way we could, but if we had it, and if we took advantage of all the people who would be willing to help, we would end up with a hugely valuable national resource.
HMRC has a lot of information about employment and career patterns, none of which we use. We worry about what is happening in seaside communities and how we can improve them, but we do not collect the data which would enable us to do it. We could become a data-based society. This data needs guarding because it is not for general use—it is for our use, and this sort of structure seems a really good way of doing that. It is not just the NHS—there is a whole range of areas in which we could greatly benefit the UK.
My Lords, all our speakers have made it clear that this is a here-and-now issue. The context has been set out by noble Lords, whether it is Stargate, the AI Opportunities Action Plan or, indeed, the Palantir contract with the NHS. This has been coming down the track for some years. There are Members on the Government Benches, such as the noble Lords, Lord Mitchell and Lord Hunt of Kings Heath, who have been telling us that we need to work out a fair way of deriving a proper financial return for the benefits of public data assets, and Future Care Capital has done likewise. The noble Lord, Lord Freyberg, has form in this area as well.
The Government's plan for the national data library and the concept of sovereign data assets raise crucial questions about how to balance the potential benefits of data sharing with the need to protect individual rights, maintain public trust and make sure that we achieve proper value for our public digital assets. I know that the Minister has a particular interest in this area, and I hope he will carry forward the work, even if this amendment does not go through.
My Lords, if we are to live in a data-rich world, we really need a set of good, well-understood definitions for the basic information we are collecting. At the moment, age is about the only stable personal characteristic, in that we generally know where it comes from and where it is recorded, and we can trust it. Name has become unstable: people are using name changes to hide previous criminal convictions, because we do not have a system for linking one name with another. Residence is widely abused by people who want to get their children into the school of their preference.
Disability, ethnicity, sexuality and religion are all self-identified. We really need to understand why we are basing policy on something that is self-identified, and whether we are collecting the right information for the policy uses we make of it, particularly when, in areas such as employment, we are encouraging people to make particular choices because they are favoured in employment advertisements. There is a collection of information there which we really ought to make an effort to be clear about if we are to make proper use of it and understand the data down the decades.
The definition we ought to do something about now is the protected characteristic of sex, because the misuse of sex and its conflation with gender has caused a whole suite of disadvantages and corruptions in the system. Basically, sex is simple: there are only two sexes. For the huge majority of humans, you can easily determine which sex they are. There are some for whom it is harder, but there are still only two sexes. We are in a situation where we record sex and use it to provide safe spaces for women, to have female sports, to know which prison to put someone in, to know how to record crime and, presumably, to know what action to take as a result of it.
Sex, and knowing how women are doing, is really important data to collect accurately, because there is a whole suite of areas in which women have historically been disadvantaged, such as employment. It is well known that standards in medical care have been set on men, not women, which has led to a series of disadvantages. We need accurate data. To my mind, rules based on reality and truth that are then adapted to people are much better than rules based on the way we wish things were, which we then try to reconcile with the truth.
We would do better for everybody—women in particular, but also people who identify as trans—if we based our description of them, when it comes to sex, on the truth. We would provide better healthcare, better protection, much easier integration into society and proper provision for them. We should seek to do this. Truth should be the basis of how we collect data; we should really insist on that. We should not corrupt our data but adapt our practice. I beg to move.
My Lords, this one should be easy. Last week, we passed amendments saying that public authorities, in recording data on matters including sex, should do so accurately. Some might think that should not be particularly controversial. This amendment says that the Government "may make regulations" about definitions of that sort—that is "may", not must. It is a negative resolution, not a positive one. It is not difficult, so let us do it.
Amendment 67, tabled by the noble Lord, Lord Lucas, would require terms relating to personal attributes to be defined consistently across government data. The Government believe that public sector data should continue to be collected based on users' needs for data and any applicable legislation, but I fully recognise the need for standards and consistency in data required for research and evaluation. Harmonisation creates more meaningful statistics that allow users to better understand a topic. It is also an important part of the Code of Practice for Statistics; the code recommends using harmonised standards unless there is a good reason not to.
As I set out in last week's debate, the Government believe that data accuracy is essential to delivering services that meet citizens' needs and to ensuring accurate evaluation and research. I will set out to the noble Lord some work that is ongoing in this space. The Office for Statistics Regulation published guidance on collecting and reporting data about sex and gender identity in February 2024, and the Government Statistical Service published a work plan for updated harmonised standards and guidance on sex and gender identity in December 2024, which will take into account the need for accurate metadata. The Sullivan review explores these issues in detail and should be published shortly; it will be taken into account as the work progresses. In addition, the Government Digital Service has started work on developing data standards on key entities and their attributes, to ensure that the way data is organised, stored and shared is consistent between public authorities.
This work commenced via the domain expert group on the "person" entity, which has representation from organisations including the Home Office, HMRC, the Office for National Statistics, NHS England, the Department for Education, the Ministry of Justice, the Local Government Association and the Police Digital Service. The group has been established as a pilot under the Data Standards Authority to help ensure consistency across organisations.
As I said last week, it is the Government's belief that these matters are crucial and need to be considered carefully, but that they are more appropriately considered holistically, outside this Bill. The intention of this Bill is not to define or remark on specific definitions of sex or gender, or other aspects of data definition. It is, of course, to make sure that the data that is collected can be made available, and I have reiterated my point that the data needs to be based in truth, consistent and clear. There is work going on to make these new regulations and approaches absolutely clear. As such, I urge the noble Lord to consider withdrawing his amendment.
My Lords, I am very grateful to the Minister for that explanation. I am particularly glad to know that the Sullivan review will be published soon—I look forward very much to reading that—and I am pleased by the direction the Government are moving in. None the less, we only get a Bill every now and again. I do think we need to give the Government the powers that this amendment offers. I would hate noble Lords opposite to feel that they had stayed here this late to no purpose, so I beg leave to test the opinion of the House.