Generative Artificial Intelligence: Schools

Debate between Damian Hinds and Jeremy Wright
Tuesday 8th July 2025

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Sir Jeremy Wright (in the Chair)

Order. That was either several interventions or a speech, neither of which is permissible. I urge all participants to keep interventions brief.

Damian Hinds

The hon. Member for Mansfield (Steve Yemm) should not misunderstand me, as I am not against regulation. His points about data protection and privacy are really important, although they are probably too big to fold entirely into this debate. His first group of points, and the risks that the NSPCC highlights, are the same risks that I am talking about.

There is an even broader point, as there is already a lot of blurring between fact, fiction and opinion online. There are all manner of news sources and influencers, network gaming, virtual reality and augmented reality, the metaverse—the whole concept of reality is a little hazier than it once was. With these machines, which in some cases almost seem to have a personality of their own, there is a danger of yet more blurring.

We all shout at our PCs sometimes. Indeed, adults using AI may start to attribute human qualities to the machine they are interacting with, which is called anthropomorphism—I occasionally try to be polite when I interact with one of these interfaces. Apps such as character.ai take that to another level.

We have to think about the impact on children in their most formative years—on their sense of self, their understanding of the world and their mental wellbeing. That includes the very youngest children, who will be growing up in a world of the internet of things and connected toys. It will be that much more important to draw a line between what is real, what is human, and what is not. In time, when the system has had enough time to think about it—we are not nearly there yet—that may be yet another area for regulation.

Finally, I come to the most immediate risks, around homework, assessments and exams. Colleagues may already have had a conversation in which a teacher has said, “Isn’t it brilliant how much so-and-so has improved? Oh, hang on—have they?” They now cannot be absolutely certain. There are AI detectors, but they are not perfect. They can produce false positives. In other words, they can accuse people of plagiarising using AI when they are not. In any event, there is an arms race between the AI machine and the AI detector machine, which is deeply unsatisfactory. Of course, that is where the teacher’s skill comes in, because there is always classwork to compare. Most importantly, there is always the exam itself, and we need to keep it that way.

The safest way to protect the integrity of exams is for them to be handwritten in exam conditions, with a teacher walking up and down between the desks—not quite for everybody, but for the vast majority of children, except where a special educational need or disability requires another arrangement. There are also subjects, such as art, design and technology and computer science, where it would not be appropriate.

There is already a big increase in access arrangements for exams. A particular type of adjustment, called a centre-delegated arrangement, does not need approval from the exam board, so no data on it is available. One such centre-delegated arrangement is to allow the child to use a keyboard—in the rubric it is called a word processor, which is a delightfully archaic term.

If children are allowed to use a keyboard, spellcheck and AutoText are disabled, to ensure safeguards are in place—but it is still true that most people can type faster than they can write, so there is a disparity in the two formats. The regulations require a school’s special educational needs co-ordinator to decide whether a child is able to use that facility, but they are still quite loose in that they refer to the keyboard being the child’s

“normal way of working at school”.

I would love the Minister to say a word about that. The Department for Education should be clear that, where such arrangements are made, it should be because of a special educational need or disability.

--- Later in debate ---
Damian Hinds

There are two simple safeguards against misuse of AI in exams here in front of me. Will the Minister recognise that the best way to ensure the security and integrity of exams, and how assessment is done lower down the school, is—for the great majority of children, in the majority of subjects—for exams to be handwritten in exam conditions?

Sir Jeremy Wright (in the Chair)

For the assistance of Hansard, I point out that the right hon. Gentleman was holding up a pen and paper.

National Security Bill

Debate between Damian Hinds and Jeremy Wright
2nd reading
Monday 6th June 2022

Commons Chamber
Damian Hinds

I will come back to my right hon. Friend’s point in a moment. To the point that the right hon. Member for Dundee East (Stewart Hosie) made, our position is that a public interest defence is just not the safest and best way for people to make disclosures, for some of the reasons I gave a moment ago.

Sir Jeremy Wright

Will the Minister give way?

Damian Hinds

If my right hon. and learned Friend will forgive me, I will not.

The existence of a public interest defence could mean that damage from the original disclosure could be compounded by further disclosures that had to be made to argue against and defeat that use of the public interest defence. That could in turn be misused, meaning that in some circumstances even egregious breaches of the law could, in effect, not be prosecuted. That is why, to respond to the point made by the right hon. Member for Dundee East, it is important that we look at the safe and proper channels and methods for making disclosures, where that is important, and there are times when it is. We are looking carefully at that.

To come back to my right hon. Friend the Member for South Holland and The Deepings—this is an important point in general—the defences in part 1 of the Bill provide law enforcement with several options for prosecuting disclosures where the person is acting for or on behalf of a foreign power or where the disclosure would materially assist a foreign intelligence service. That can include bulk disclosures. To be clear, with this Bill, the maximum sentence for an indiscriminate disclosure—a bulk data dump—will be higher than it is today if that act is done for a foreign power or the disclosure would materially assist a foreign intelligence service, even if not procured by that foreign intelligence service itself.

--- Later in debate ---
Damian Hinds

I do not think that this is an appropriate forum in which to discuss the detail of such measures, but I hope I can reassure my hon. Friend on that particular point. As I have said, this is to allow for cases in which such capacity is required owing to operational need, and it cannot be outside the United Kingdom.

A number of Members on both sides of the House have referred to the so-called STPIMs: state threats prevention and investigation measures. These are a tool of last resort to prevent, restrict and disrupt an individual’s involvement in state threats activity. In the most serious cases, that could include restricting where an individual can reside, whom they can associate with, and where they can work and study. An STPIM will be used when intelligence exists to confirm that highly damaging threat activity is planned or being undertaken but prosecution is not realistic. As my hon. Friend said, with such measures it is extremely important to have the appropriate safeguards.

I want to reassure the hon. Member for Cumbernauld, Kilsyth and Kirkintilloch East (Stuart C. McDonald) that STPIMs will not be imposed through ministerial decision making alone. There will be a process through the courts. A decision by the Secretary of State to impose an STPIM, once they are satisfied that the five conditions set out have been met, will be referred to a judge, and the court’s permission will be sought before an order can be made. The court is specifically tasked with checking that the ministerial decision is not flawed.

My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) and others spoke about civil legal aid for terrorists. Through the Bill, we will take action to restrict access to civil legal aid in England and Wales for individuals convicted of terrorism or terrorism-connected offences since 2001. However, I can assure my right hon. and learned Friend, my hon. Friend the Member for Bromley and Chislehurst (Sir Robert Neill), the hon. Member for Garston and Halewood (Maria Eagle) and others who have spoken about this that the restriction of access to civil legal aid applies only to offences involving a sentence of more than two years. In any event, all individuals subject to the restriction can apply for exceptional case funding, and applications will be assessed according to the legislative framework of whether an individual’s human rights may be breached without legal aid. The type of terrorism offence that had been committed would not have a bearing on the exceptional case funding decision.

I need to spend a couple of minutes going through the amendments to the Serious Crime Act 2007, an important subject that a number of colleagues have brought up, including my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and my hon. Friend the Member for Wycombe. The context, of course, is that our intelligence and security services and armed forces do and must work in close partnership with international partners to maximise UK capabilities and their ability to protect national security on our behalf. A key part of that is sharing intelligence and data to support joint objectives.

However, it is possible that such intelligence, when shared in good faith and in accordance with all domestic and international law, could still be capable of contributing, even in a very small or indirect way that was not intended at the time it was shared, to an international partner’s engaging in activity that the UK would not support. The Serious Crime Act 2007 creates an offence where an act is done that is

“capable of encouraging or assisting…an offence”.

That means that in this scenario there is a risk of individuals facing criminal liability, even when they have operated in good faith and in accordance with the guidance and proper authorisation.

Put simply, the Government believe it is not fair to expect the liability for that unforeseen eventuality to sit with an individual officer of our intelligence services or member of the armed forces who is acting with wholly legitimate intentions. Instead, the liability should sit with the UK intelligence community and the military at an institutional level, where they are subject to executive, judicial and parliamentary oversight. The amendment at clause 23 therefore removes that liability for individuals, but specifically only where the activity is necessary for the proper exercise of the functions of the security and intelligence services or the armed forces. It does not remove liability at an institutional level for any activity.

Sir Jeremy Wright

As my right hon. Friend knows, I think there is no dispute across the House that some protection should be available for individuals in those circumstances. The question we have been asking is how different what clause 23 provides for is from what already exists in law. Clause 23 will ask for consideration to be given of whether there has been a proper exercise of a function. That must logically, therefore, relate to the behaviour of an individual, must it not?

Damian Hinds

My right hon. and learned Friend anticipates my next point to some extent. In instances where an individual has operated in good faith in compliance with domestic and international law and all proper process, they would then not face the risk of liability under the 2007 Act for something they could not have foreseen. In effect, we are adding greater certainty and specificity to an existing defence—the reasonable defence contained within that Act—by detailing scenarios where the offence will not apply, whereas the current defence is untested and imprecise.

The amendment means that, where an individual is working properly on behalf of our intelligence and security services and armed forces with an international partner to protect national security, they do not personally risk criminal liability if their work is later found to have been capable of contributing to unlawful activity in a way they would not have intended. That risk should remain with the Government, the services and the armed forces at corporate level, and that is what this amendment seeks to ensure.

A number of colleagues have raised the question of disinformation. They are correct that information operations are now a firm feature in the set of devices available to hostile states. There is direct disinformation, where talking points are put out by those states on foreign affairs or on our domestic politics and society, but there is also the terrible technique of indirect disinformation, which is not necessarily intended to make anybody believe a particular line or narrative, but is simply aimed at causing division and discord in our country, to undermine our democracy and the cohesion of our society.

This Bill deals with people who carry out disinformation for a foreign state, but I want to be clear that legislation on the material itself belongs in the Online Safety Bill. We are looking at how to amend that Bill to account for disinformation material where that disinformation amounts to foreign interference, so that it can be treated as illegal material.