Data (Use and Access) Bill [HL] Debate
Earl of Erroll (Crossbench - Excepted Hereditary)
(1 month ago)
Lords Chamber

My Lords, I want to get on to the digital verification service. First, I declare that I am very interested in digital twins; there are huge advantages in modelling—for instance, the underground and all the various things that can choke traffic. I went to a very interesting event at Connected Places Catapult, where they are modelling all the influences on traffic and flows, et cetera, to try to work out how you can alleviate it and get emergency services through when everything is choked up. There is huge advantage in being able to model that, and for that we need data sharing and all the other things.
The other thing I got very interested and involved in, with FIDO and Kaimai, is causal AI. As people say, we need to know how it got there: what sources was it relying on when it reached the conclusions that it put into its reports or decisions? It is a very important aspect, because the “I” in AI is not really the right word to use. A computer is not innately intelligent. It is like a child; what you put into it and what it learns from that could well be not what you expected it to learn at all. We have to be very careful of believing too much in it, putting too much faith in it and thinking that it will run the future beautifully.
Here is the bit that I am more interested in. I was talking to my noble friend Lady Kidron just before the debate, and she pointed out something to me because of my previous involvement in chairing the British Standard PAS 1296 on age verification, which we did around the Digital Economy Act when we were worried about all the implications and the pornography issues. The trouble now is that the Government seem to think that the only age verification that matters is checking that someone is over 18, so that they can purchase alcohol and knives and view inappropriate pornography, which should not be there for youngsters. But when we wrote it, we were very careful to make sure that there was no age specified. For educational purposes, there is material that you want to reach particular age cohorts but do not want children at other ages to see, because it is wrong for their stage of development and knowledge. Also, you need to be able to check that older people are not trying to get into children’s social groups; they must be excludable from them. Age verification, whenever it is referred to, should work in any direction and at any age you want it. It should not be so inflexible.
I was sent a briefing by the Association of Document Validation Professionals and the Age Verification Providers Association. I was very much there when all that started off, when I was chairman of EURIM, which became the Digital Policy Alliance. They represent some 50 attribute and identity providers, and they warmly welcome the Bill and the priority that the new Government are giving to returning it to Parliament. I will highlight the sections of the Bill dealing with digital verification services that they feel would merit further discussion during later stages.
In Clause 27, “Introductory”, ideally there would be a clear purpose statement that the Bill makes certified digital ID legally valid as a proof of identity. This has always been a problem. To progress in the digital world, we will need to ensure that digital verification of ID is given equal standing with physical passports and driving licences. These documents are technically only tokens enabling you to cross borders or drive a vehicle, but they are frequently used as proof of ID. This can be done digitally, and the benefit of digital ID is that it is much harder to forge and therefore much more reliable. For some reason, we have always had a fear of that in the past.
In Clause 29, on “supplementary codes”, they are very worried that developing these codes could take longer and cost more if the processes are owned by the Office for Digital Identities and Attributes—OfDIA. There should be a stronger obligation to co-create with the industry, both in preparing the initial rules and in any revisions. The drafting is, apparently, currently ambiguous about any requirements for consultation. I know that that has been a problem in the past. There will be specialist requirements particular to specific sectors, and the OfDIA will not necessarily have the required expertise in-house. There are already organisations in place to do things around each of these codes.
In Clause 33, on registration on the digital verification services register, the changes to the previous Bill around registration processes are welcome and, most notably, the Government have recognised in the Bill the need for national security checks. The problem is that there is no independent appeals mechanism if the Secretary of State refuses to register a DVS or removes it from the register, short of judicial review—and that is both time consuming and very expensive. Most would not be able to survive long enough to bring the case to a conclusion, so we need to think of other remedies, such as some form of appeals tribunal.
In Clause 39, on the fees for registration et cetera, the fees are in effect a new tax on the industry and may raise more than is needed to cover the costs of administering the scheme. They welcome fees now being subject to parliamentary scrutiny, but would like to see a statutory limit preventing more being raised than is required to fund DVS governance. There are figures on it which I could give you, but I will not bore you with them right now.
In Clause 50, on trust marks for use by registered persons, there may be a benefit from more direct linking of the requirements relating to marks of conformity to the Trade Marks Act.
In Clause 51, on the powers of a Secretary of State to require information, this wide-ranging power to demand information may inherently override the Data Protection Act. It extinguishes any obligation of confidentiality owed by a conformity assessment body to its clients, such as the contents of an audit report. The net effect could be to open up audit reports to freedom of information requests: the exemption from FoI would have been that they were confidential, but the Bill appears to override that. The way the Bill is structured could also mean that the Secretary of State can override a court order imposing confidentiality. I do not think we should allow that.
Clause 52 is about arrangements for third parties to exercise functions. In its current form, the Office for Digital Identities and Attributes is an unusual regulator. It is not independent from the Government and does not share the features of other regulators. It may therefore not be able to participate in the Digital Regulation Cooperation Forum, for example, based on the powers relied upon by its members to collaborate with other regulators.
The OfDIA may not be within the scope of the regulatory duty on most regulators to promote growth. It is unclear whether the new Regulatory Innovation Office will have jurisdiction over the OfDIA. It would be helpful to explore whether a more conventional status as an independent regulator would be preferable.
I think that is enough complication for the moment.
Data (Use and Access) Bill [HL] Debate
Earl of Erroll
(3 weeks, 1 day ago)
Grand Committee

My Lords, I would like to say a few things about this. The first is that Amendment 5, in the name of the noble Lord, Lord Lucas, is very sensible; sometimes the GDPR has gone too far in trying to block what you can use things for. It was originally thought of when so much spamming was going on, with people gathering data from adverts and all sorts of other things and then misusing it for other purposes. People got fed up with the level of spam. This is not about that sort of thing; it is about having useful data that would help people in the future, and which they would not mind being used for other purposes. As long as it is done properly and seriously, and not for marketing, advertising and all those other things, and for something which is useful to people, I cannot see what the problem is. An overzealous use of GDPR, which has happened from time to time, has made it very difficult to use something perfectly sensible, which people would not mind having other people know about when it is being useful.
The next matter is sex, which is an interesting issue. The noble Lord is absolutely correct that biological or genetic sex is vital when applying medicines and various other things. You have to know that you are administering certain drugs properly. As we get more and more new drugs coming on, it will matter how a person’s body will react to them, which will depend on the genetic material, effectively. Therefore, it is essential to know what the biological sex is. The answer is that we need another category—probably “current gender”—alongside “sex at birth”. Someone can then decide to use “current gender” for certain purposes, including for such things as passports and driving licences, where people do not want to be asked questions—“Oh, do you mean you’re not?”—because they look completely different.
I remember meeting April Ashley in her restaurant. I would not, in my innocence—I was quite young—have guessed that she was not a woman, except that someone said that her hands were very big. It never worried us in those days. I am not worried about people using a different gender, but the basic underlying truth is essential. It comes into the issue of sport. If you have grown up and developed physically as a biological male, your bone structure and strength are likely to be different from that of a female. There are huge issues with that, and we need to know both; people can decide which to use at certain points. Having both would give you the flexibility to do that.
That also applies to Amendment 200, from the noble Lord, Lord Lucas, which is exactly the same concept. I thoroughly agree with those amendments and think we should push them forward.
My Lords, I too am delighted that the noble Lord, Lord Lucas, came in to move his amendment. He is the expert in that whole area of education data; like the noble Lord, Lord Arbuthnot, I found what he said extremely persuasive.
I need to declare an interest as chair of the council of Queen Mary, University of London, in the context of Amendment 5 in the name of the noble Lord, Lord Lucas. I must say, if use were made of that data, it would benefit not only students but universities. I am sure that the Minister will take that seriously but, on the face of it, like the noble Earl, Lord Erroll, I cannot see any reason why this amendment should not be adopted.
I very much support Amendments 34 and 48 in the name of the noble Lord, Lord Arbuthnot. I too have read the briefing from Sex Matters. The noble Lord’s pursuit of accuracy for the records that will be part of the wallet, if you like, to be created for these digital verification services is a matter of considerable importance. In reading the Sex Matters briefing, I was quite surprised. I had not realised that it is possible to change your stated sex on your passport in the way that has taken place. The noble Lord referred to the more than 3,000 cases of this; for driving licences, there have been more than 15,000.
I agree with Sex Matters when it says that this could lead to a loss of trust in the system. However, I also agree with the noble Earl, Lord Erroll, that this is not an either/or. It could be both. It is perfectly feasible to have both on your passport, if you so choose. I do not see this as a great divide as long as the statement about sex is accurate because, for a great many reasons—not least in healthcare—it is of considerable importance that the statement about one’s sex is accurate.
I looked back at what the Minister said at Second Reading. I admit that I did not find it too clear but I hope that, even if she cannot accept these amendments, she will be able to give an assurance that, under this scheme—after all, it is pretty skeletal; we will come on to some amendments that try to flesh it out somewhat—the information on which it will be based is accurate. That must be a fundamental underlying principle. We should thank the noble Lord, Lord Arbuthnot, for tabling these two important amendments in that respect.
Data (Use and Access) Bill [HL] Debate
Earl of Erroll
(1 week ago)
Grand Committee

My Lords, I shall also speak extremely briefly, as one of the three veterans of the Joint Committee present in Committee today, to reinforce my support for these amendments. The Government should be congratulated on Clause 123. It is welcome to see this movement but we want to see this done quickly. We want to ensure that it is properly enforceable, that terms of service cannot be used to obstruct access to researchers, as the noble Lord, Lord Bethell, said, and that there is proper global access by researchers, because, of course, these are global tech companies and UK users need to be protected through transparency. It is notable that, in the government consultation on copyright and AI published yesterday, transparency is a core principle of what the Government are arguing for. It is this transparency that we need in this context, through independent researchers. I strongly commend these amendments to the Minister.
My Lords, I would like to just make one comment on this group. I entirely agree with everything that has been said and, in particular, with the amendments in the name of the noble Baroness, Lady Kidron, but the one that I want to single out—it is why I am bothering to stand up—is Amendment 197, which says that the Secretary of State “must” implement this measure.
I was heavily scarred back in 2017 by the Executive’s refusal to implement Part 3 of the Digital Economy Act in order to protect our children from pornography. Now, nearly eight years later, they are still not protected. It was never done properly, in my opinion, in the then Online Safety Bill either; it still has not been implemented. I think, therefore, that we need to have a “must” there. We have an Executive who are refusing to carry out the instruction given by Parliament when it passed the legislation. We have a problem, but I think that we can address it by putting “must” in the Bill. Then, we can hold the Executive to account.
My Lords, the trouble with this House is that some have long memories. The noble Earl, Lord Erroll, reminded us all to look back, with real regret, at the Digital Economy Act and the failure to implement Part 3. I think that that was a misstep by the previous Government.
Like all of us, I warmly welcome the inclusion of data access provisions for researchers studying online safety matters in Clause 123 of the Bill. As we heard from the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, this was very much unfinished business from the Online Safety Act. However, I believe that, in order for the Bill to be effective and have the desired effect, the Government need to accept the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. In terms of timeframe, the width of research possible, enforceability, contractual elements and location, they cover the bases extremely effectively.
The point was made extremely well by the noble Lords, Lord Bethell and Lord Russell, that we should not have to rely on brave whistleblowers such as Frances Haugen. We should be able to benefit from quality researchers, whether from academia or elsewhere, in order to carry out this important work.
My Amendment 198B is intended as a probing amendment about the definition of researchers under Clause 123, which has to be carefully drawn to allow for legitimate non-governmental organisations, academics and so on, but not so widely that it can be exploited by bad actors. For example, we do not want those who seek to identify potential exploits in a platform to shelter under this provision simply by describing themselves as “independent researchers”. For instance, could Tommy Robinson seek to protect himself from liabilities in this way? After all, he called himself an “independent journalist” in another context when he clearly was not. I hope that when the Government come to draw up the regulations they will be mindful of the need to be very clear about what constitutes an independent or accredited researcher, or whatever phrase will be used in the context.
My Lords, I strongly support this amendment. As a former Minister, I was at the front line of genomic data and know how powerful it currently is and can be in the future. Having discussed this with the UK Biobank, I know that the issue of who stores and processes genomic data in the UK is a subject of huge and grave concern. I emphasise that the American Government have moved on this issue already and emphatically. There is the possibility that we will be left behind in global standards and will one day be an outlier if we do not close this important and strategically delicate loophole. For that reason, I strongly support this amendment.
My Lords, I was involved in an ethics committee that looked at genomics and cancer research some years ago, and this is very important. If research could be done on different genomic and racial types, it could at some point be used against us. So there is a lot of sense in this.
My Lords, I thank the noble Viscount, Lord Camrose, for moving this amendment, which raises this important question about our genomics databases, and for the disturbing examples that he has drawn to our attention. He is right that the opportunities from harnessing genomic data come with very real risks. This is why the Government have continued the important work of the UK Biological Security Strategy of 2023, including by conducting a full risk assessment and providing updated guidance to reduce the risks from the misuse of sensitive data. We plan to brief the Joint Committee on the National Security Strategy on the findings of the risk assessment in the new year. Following that, I look forward to engaging with the noble Viscount on its outcome and on how we intend to take these issues forward. As he says, this is a vital issue, but in the meantime I hope he is prepared to withdraw his amendment.
My Lords, I will speak briefly in support of this amendment. Anyone who has written computer code, and I plead guilty, knows that large software systems are never bug-free. These bugs can arise because of software design errors, human errors in coding or unexpected software interactions for some input data. Every computer scientist or software engineer will readily acknowledge that computer systems have a latent propensity to function incorrectly.
As the noble Baroness, Lady Kidron, has already said, we all regularly experience the phenomenon of bug fixing when we download updates to software products in everyday use—for example, Office 365. These updates include not only new features but patches to fix bugs which have become apparent only in the current version of the software. The legal presumption of the proper functioning of “mechanical instruments” that courts in England and Wales have been applying to computers since 1999 has been shown by the Post Office Horizon IT inquiry to be deeply flawed. The more complex the program, the more likely the occurrences of incorrect functioning, even with modular design. The program at the heart of Fujitsu’s Horizon IT system had tens of millions of lines of code.
The unwillingness of the courts to accept that the Horizon IT system developed for the Post Office was unreliable and lacking in robustness—until the key judgment, which has already been mentioned, by Mr Justice Fraser in 2019—is one of the main reasons why more than 900 sub-postmasters were wrongly prosecuted. The error logs of any computer system make it possible to identify unexpected states in the computer software and hence erroneous system behaviour. Error logs for the Horizon IT system were disclosed only in response to a direction from the court in early 2019. At that point, the records from Fujitsu’s browser-based incident management system revealed 218,000 different error records for the Horizon system.
For 18 years prior to 2019, the Post Office did not disclose any error log data, documents which are routinely maintained and kept for any computer system of any size and complexity. Existing disclosure arrangements in legal proceedings do not work effectively for computer software, and this amendment concerning the electronic evidence produced by or derived from a computer system seeks to address this issue. The Post Office Horizon IT inquiry finished hearing evidence yesterday, having catalogued a human tragedy of unparalleled scale, one of the most widespread miscarriages of justice in the UK. Whether it is by means of this amendment or otherwise, wrongful prosecutions on the basis that computers always operate properly cannot continue any longer.
My Lords, if I may just interject, I have seen this happen not just in the Horizon scandal. Several years ago, the banks were saying that you could not possibly find out someone’s PIN and were therefore refusing to refund people who had had stuff stolen from them. It was not until the late Professor Ross Anderson, of the computer science department at Cambridge University, proved that they had been deliberately misdirecting the courts about which counter was actually being read, and explained exactly how you could get the system to fall back to a different set of counters, that the banks eventually had to give way. But they went on lying to the courts for a long time. I am afraid that this is something that keeps happening again and again, and an amendment like this is essential for future justice for innocent people.
My Lords, it is a pity that this debate is taking place so late. I thank the noble Lord, Lord Arbuthnot, for his kind remarks, but my work ethic feels under considerable pressure at this time of night.
All I will say is that this is a much better amendment than the one that the noble Baroness, Lady Kidron, put forward for the Data Protection and Digital Information Bill, and I very strongly support it. Not only is this horrifying in the context of the past Horizon cases, but I read a report about the Capture software, which is likely to have created shortfalls that led to sub-postmasters being prosecuted as well. This is an ongoing issue. The Criminal Cases Review Commission is reviewing five Post Office convictions in which the Capture IT system could be a factor, so we cannot say that this is about just Horizon, as there are the many other cases that the noble Baroness cited.
We need to change this common law presumption even more in the face of a world in which AI use, with all its flaws and hallucinations, is becoming ever present, and we need to do it urgently.