Data Protection and Digital Information Bill Debate
Grand Committee

That is one of the questions that I can now answer. The power will allow this, in so far as it pertains to helping the Secretary of State establish whether the benefits are being paid properly, as provided for in paragraph 1(2) of new Schedule 3B. Rules around living together are relevant only to some benefits. That is a very short answer, but I could expand on it.
May I add to the very long letter? I have been sitting here worrying about this idea that one of the “signals” will be excess capital and that there are then matching accounts. If the matching account has more capital—for example, the person who has a connected account is breaching the £16,000 or £6,000 limit—does that signal trigger some sort of investigation?
That is a very fair question, and I hope that I understand it correctly. I can say that the DWP is limited to what the third party produces; whatever goes on behind the doors of the third party is a matter for them, not for us. Whether there is a related account, and how best to operate, is a matter for the bank to decide. We may therefore end up receiving very limited information, given the limits of our powers. I hope that helps, but I will add some more detail in the letter.
My Lords, having listened carefully to representations from across the House at Second Reading, I am introducing this amendment to address concerns about the data preservation powers established in the Bill. The amendment provides for coroners, and procurators fiscal in Scotland, to initiate the data preservation process when they decide it is necessary and appropriate to support their investigations into a child’s death, irrespective of the suspected cause of death.
This amendment demonstrates our commitment to ensuring that coroners and procurators fiscal can access the online data they may need to support their investigation into a child’s death. It is important to emphasise that coroners and procurators fiscal, as independent judges, have discretion about whether to trigger the data preservation process. We are grateful to the families, Peers and coroners whom we spoke to in developing these measures. In particular, I thank the noble Baroness, Lady Kidron, who is in her place. I beg to move.
My Lords, it is an unusual pleasure to support the Minister and to say that this is a very welcome amendment to address a terrible error of judgment made when the Government first added the measure to the Bill in the other place and excluded data access for coroners in respect of children who died by means other than suicide. I shall not replay here the reasons why it was wrong, but I am extremely glad that the Government have put it right. I wish to take this opportunity to pay tribute to those past and present at 5Rights and the NSPCC for their support and to those journalists who understood why data access for coroners is a central plank of online safety.
I too recognise the role of the Bereaved Families for Online Safety. They bear the pain of losing a child and, as their testimony has repeatedly shown, not knowing the circumstances surrounding that death is a particularly cruel revictimisation for families, who never lose their grief but simply learn to live with it. We owe them a debt of gratitude for putting their grief to work for the benefit of other families and other children.
My Lords, Amendment 251 is also in the names of the noble Lords, Lord Arbuthnot and Lord Clement-Jones, and the noble Baroness, Lady Jones. I commend the noble Lord, Lord Arbuthnot, for his staunch support of the sub-postmasters over many years. I am grateful to him for adding his name to this amendment.
This amendment overturns a previous intervention in the law that has had and will continue to have far-reaching consequences if left in place: the notion that computer evidence should in law be presumed to be reliable. This error, made by the Government and the Law Commission at the turn of the century and reinforced by the courts over decades, has, as we now know, cost innocent people their reputations, their livelihoods and, in some cases, their lives.
Previously, Section 69 of the Police and Criminal Evidence Act 1984 required prosecutors in criminal cases relying on information from computers to confirm that the computer was operating correctly and could not have been tampered with before that evidence could be submitted. As the volume of evidence from computers increased, this requirement came to be viewed as burdensome.
In 1997, the Law Commission published a paper, Evidence in Criminal Proceedings: Hearsay and Related Topics, in which it concluded that Section 69
“fails to serve any useful purpose”.
As a result, it was repealed. The effect of this repeal was to create a common law presumption, in both criminal and civil proceedings, of the proper functioning of machines—that is to say, the computer is always right. In principle, there is a low threshold for rebutting this presumption but, in practice, as the Post Office prosecutions all too tragically show, a person challenging evidence derived from a computer will typically have no visibility of the system in question or of the ways in which it could or did fail. As a result, they will not know what records of failures exist, and so cannot know what should be disclosed to them or what to ask for.
This situation was illustrated in the Post Office prosecution of sub-postmaster Mrs Seema Misra. Paul Marshall, Mrs Misra’s defence lawyer, describes how she was
“taunted by the prosecution for being unable to point to any … identifiable … problem”,
while they hid behind the presumption that the Horizon system was “reliable” under the law. On four occasions during her prosecution, Mrs Misra applied for court orders requiring the Post Office to disclose Horizon error records. Three different judges dismissed her applications. Mrs Misra went to prison. She was eight weeks pregnant, and it was her son’s 10th birthday. On being sentenced, she collapsed.
The repeal of Section 69 of PACE 1984 reflects the Law Commission’s flawed belief that most computer errors were “down to the operator” or “apparent to the operator”, and that you could
“take as read that computer evidence is reliable unless a person can say otherwise”.
In the words of a colleague of mine from the University of Oxford, a professor of computing with a side consultancy specialising in finding bugs for global tech firms ahead of rollout, this assumption is “eye-wateringly mistaken”. He recently wrote to me and said:
“I have been asking fellow computer scientists for evidence that computers make mistakes, and have found that they are bewildered at the question since it is self-evident”.
There is an injustice in being told that a machine will always work as expected, and a further injustice in being told that the only way you can prove that it does not work is to ask by name for something that you do not know exists. That is to say, Mrs Misra did not have the magic word.
In discussions, the Government assert that the harm caused by Horizon was due to the egregious failures of corporate governance at the Post Office. That there has been a historic miscarriage of justice is beyond question, and the outcome of the inquiry is urgently awaited. But the actions of the Post Office were made possible in part by a flaw in our legal and judicial processes. What happened at the Post Office is not an isolated incident but potentially the tip of an iceberg, where the safety of an unknown number of criminal convictions and civil judgments is called into question.
For example, an online English-language test commissioned by the Home Office from the Educational Testing Service wrongly determined that 97% of the students taking it were cheating, a determination that cost the students their right to stay in the UK and/or their ability to graduate, and meant that thousands of pounds in student fees were forfeited. The Guardian conducted interviews with dozens of the students, who described the painful consequences. One man was held in UK immigration detention centres for 11 months. Others described being forced into destitution, becoming homeless and reliant on food banks as they attempted to challenge the accusation. Others became depressed and suicidal when confronted with the wasted tuition fees and the difficulty of shaking off an allegation of dishonesty.
The widespread coverage of the Horizon scandal has made many victims of the Home Office scandal renew their efforts to clear their names and seek redress. In another case, at the Princess of Wales Hospital in 2012, nurses were wrongly accused of falsifying patient records because of discrepancies found in computer records. Some of the nurses were subjected to criminal prosecution, suffering years of legal action before the trial collapsed, when it emerged that a visit by an engineer to fix a bug had eradicated all the data that the nurses were accused of failing to gather. That vital piece of information could easily have been discovered and disclosed had computer evidence not automatically been deemed reliable.
It may have already done so, but I will certainly pass that on.
I thank everyone who spoke and the Minister for the offer of a meeting alongside his colleagues from the MoJ. I believe he will have a very busy diary between Committee and Report, based on the number of meetings we have agreed to.
However, I want to be very clear here. We have all recognised that the story of the Post Office sub-postmasters makes this issue clear, but it is not about the sub-postmasters. I commend the Government for what they are doing. We await the inquiry with urgent interest, and I am sure I speak for everyone in wishing the sub-postmasters a fair settlement—that is not in question. What is in question is the fact that we do not have unlimited Lord Arbuthnots to be heroic about all the other things that are about to happen. I took it seriously when he said not one moment longer: it could be tomorrow.
My Lords, I rise somewhat reluctantly to speak to Amendment 291 in my name. It could hardly be more important or necessary, but I am reluctant because I really think that the Minister, alongside his colleagues in DSIT and the Home Office, should have taken this issue up. I am quite taken aback that, despite my repeated efforts with both of those departments, they have not done so.
The purpose of the amendment is simple. It is already illegal in the UK to possess or distribute child sexual abuse material, including AI-generated or computer-generated child sexual abuse material. However, while the content is clearly covered by existing law, the mechanism that enables its creation—the model files trained on, or trained to create, child sexual abuse material—is not. This amendment closes that gap.
Some time ago, I hosted an event at which members of OCCIT—the online child sexual exploitation and abuse covert intelligence team—gave a presentation to parliamentarians. For context, OCCIT is a law enforcement unit of the National Police Chiefs’ Council that uses covert police tactics to track down offender behaviour, with a view to identifying emerging risks in the form of new technologies, behaviours and environments. The presentation its officers gave concerned AI-generated abuse scenarios in virtual reality, and it was absolutely shattering for almost everyone who was present.
A few weeks later, the team contacted me and said that what it had shown then was already out of date. What it was now seeing was being supercharged by the ease with which criminals can train models that, when combined with general-purpose image-creation software, enable those with a sexual interest in children to generate CSAM images and videos at volume and—importantly—to order. Those building and distributing this software were operating with impunity, because current laws are insufficient to enable the police to take action against them.
In the scenarios that the police are now facing, a picture of any child can be blended with existing child sexual abuse imagery, pornography or violent sexual scenarios. Images of several children can be combined into a fictitious child and used similarly or, as I will return to in a moment, a picture of an adult can be made to look younger and then used to create child sexual abuse material. Among this catalogue of horrors are the made-to-order models trained using images of a child known to the perpetrator—a neighbour’s child or a family member—to create bespoke CSAM content. In short, the police were finding that the scale, sophistication and horror of violent child sexual abuse had hit a new level.
The laws that the police use to enforce against CSAM are Section 1 of the Protection of Children Act 1978 and Section 160 of the Criminal Justice Act 1988, both of which create offences in respect of indecent photographs or pseudophotographs of a child. AI content depicting child sexual abuse in the scenarios that I have just described is also illegal under these laws, but creating and distributing the software models needed to generate it is not.
There are many services that allow anyone to take any public image and put it in a false situation. Although I have argued elsewhere that AI images should carry a mark of provenance, these services are not the subject of this amendment. This amendment is laser focused on criminalising AI models that are trained on or trained to create child sexual abuse material. They are specific, specialist and being traded with impunity. These models blend images of children—known children, stock photos, images scraped from social media or synthetic, fabricated AI depictions of children—with existing CSAM or pornography, and they allow paedophiles to generate bespoke CSAM scenarios.
I thank the noble Baroness, Lady Kidron, for tabling Amendment 291, which would create several new criminal offences relating to the use of AI to collect, collate and distribute child abuse images or to possess such images after they have been created. Nobody can dispute the intention behind this amendment.
We recognise the importance of this area. We will continue to assess whether and what new offences are needed to further bolster the legislation relating to child sexual abuse and AI, as part of our wider ongoing review of how our laws need to adapt to AI risks and opportunities. We need to get the answers to these complex questions right, and we need to ensure that we are equipping law enforcement with the capabilities and the powers needed to combat child sexual abuse. Perhaps, when I meet the noble Baroness, Lady Kidron, on the previous group, we can also discuss this important matter.
However, for now, I reassure noble Lords that any child sexual abuse material, whether AI generated or not, is already illegal in the UK, as has been said. The criminal law is comprehensive with regard to the production and distribution of this material. For example, it is already an offence to produce, store or share any material that contains or depicts child sexual abuse, regardless of whether the material depicts a real child or not. This prohibition includes AI-generated child sexual abuse material and other pseudo imagery that may have been AI or computer generated.
We are committed to bringing to justice offenders who deliberately misuse AI to generate child sexual abuse material. We demonstrated this as part of the road to the AI Safety Summit, where we secured agreement from NGOs, industry and international partners to take action to tackle AI-enabled child sexual abuse. The strongest protections in the Online Safety Act are for children, and all companies in scope of the legislation will need to tackle child sexual abuse material as a priority. Applications that use artificial intelligence will not be exempt and must incorporate robust guard-rails and safety measures to ensure that AI models and technology cannot be manipulated for child sexual abuse purposes.
Furthermore, I reassure noble Lords that the offence of taking, making, distributing and possessing with a view to distribution any indecent photograph or pseudophotograph of a child under the age of 18 carries a maximum sentence of 10 years’ imprisonment. Possession alone of indecent photographs or pseudophotographs of children can carry a maximum sentence of up to five years’ imprisonment.
However, I am not able to accept the amendment, as the current drafting would capture legitimate AI models that have been deliberately misused by offenders, without the knowledge or intent of their creators, to produce child sexual abuse material. It would also inadvertently criminalise individual users who possess perfectly legal digital files with no criminal intent, because those files could, when combined, enable the creation of child sexual abuse material.
I therefore ask the noble Baroness to withdraw the amendment, while recognising the strength of feeling and the strong arguments made on this issue and reiterating my offer to meet with her to discuss this ahead of Report.
I do not know how to express in parliamentary terms the depth of my disappointment, so I will leave that. Whoever helped the noble Viscount draft his response should be ashamed. We do not have a comprehensive system, and the police do not have the capability: they came to me after months of trying to get the Home Office to act. To say that they have the capability is an untruth.
I remind the noble Viscount that in previous debates his response on the bigger picture of AI has been to wait and see, but this is a here and now problem. As the noble Baroness, Lady Jones, set out, this amendment would give purpose and reason to act—and here it is in front of us; we can act.