National Security Bill (Second sitting) Debate
Sally-Ann Hart (Conservative - Hastings and Rye)
Public Bill Committees
Q
Sam Armstrong: The Australian scheme is by far and away the best example—in my view, the US FARA system is not a good comparator—and it is a shame that we have not taken the opportunity to bring it in sooner. The Australian high commissioner in London was George Brandis, who was the Attorney General who wrote that very Bill, and I know he was keen wherever possible to impress on the Government that he was there and ready to help. I am sure that offer has not dissipated.
Q
“a person commits an offence if…the person engages in conduct intending that the conduct, or a course of conduct”
and
“the foreign power condition is met…if… the person knows, or ought reasonably to know, that”
it is a foreign power. Do you think that should be widened to include an element of recklessness?
Carl Miller: I think doing anything that might compel any of the services involved to do any kind of due diligence on the people who are employing them can only be a good thing, although the general point I am making is that I don’t think criminalising activity within domestic legislation has been a particularly effective way of changing what people do on the internet, especially when those people are largely concentrated in jurisdictions that do not have any co-operative relationship with British law enforcement.
I remember I spent time with a number of cyber-crime teams across the UK and, in the words of one cyber-crime police officer, “If you are in Russia, the cost or penalty of doing cyber-crimes against British citizens is basically nil.” This is not going to be an effective way of reaching beyond our borders and addressing where we believe a large number of actors doing this kind of thing are; they are not doing this from the UK.
Q
Carl Miller: Sure. First, we need to change the intelligence picture slightly. We should integrate SOCMINT—social media intelligence—within the national strategic intelligence picture. We overlooked open-source intelligence—
But that is not to do with this Bill, is it?
Carl Miller: Sorry, I thought you asked me— Would you like to hear what I think?
Yes, carry on.
Carl Miller: Partly it is to do with changing our national knowledge of where these threats are and who is doing them, so the integration of intelligence. Then, as I said, there should be a national risk register and possibly the creation of powers for parts of the intelligence establishment to undertake direct activity against some of the technical architectures that allow this to happen.
Sorry to delve into the technicalities for a second, but, for instance, residential proxy IP addresses are a very important way in which this stuff happens. Residential proxy IPs are toasters and fridges and stuff. Basically, they each have an IP address and many of them are hijacked. They are the kind of things that you use if you want to fool a social media platform into thinking that you are 10,000 people from around the planet when you are not—you are one operator sitting in a particular country. These are criminal architectures that have been amassed and rented out and sold to people, and I am sure they are rented out by some of the actors who seek to do influence operations. These are the kinds of things that we need to target. Putting pressure on that kind of asset will probably not get rid of them, but it will meaningfully increase the costs of this kind of activity.
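As a rough illustration of the kind of technical pressure point the witness describes, the sketch below shows how a platform or investigator might flag account activity that arrives from IP ranges believed to belong to hijacked residential-proxy pools. The blocklist ranges, account identifiers and events are invented placeholders for this example only; real detection would rely on commercial proxy-intelligence feeds and many more signals.

```python
# Hypothetical sketch: flag account activity originating from IP ranges
# thought to belong to hijacked residential-proxy networks.
# All ranges and events below are invented placeholders, not real data.
import ipaddress

# Example blocklist of residential-proxy ranges (documentation ranges used as stand-ins).
PROXY_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]


def is_proxy_ip(ip: str) -> bool:
    """Return True if the address falls inside a listed residential-proxy range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in PROXY_RANGES)


def flag_accounts(events: list[tuple[str, str]]) -> set[str]:
    """Given (account_id, source_ip) events, return accounts seen on proxy ranges."""
    return {account for account, ip in events if is_proxy_ip(ip)}


if __name__ == "__main__":
    sample_events = [
        ("acct_1", "203.0.113.45"),  # inside a listed range, so flagged
        ("acct_2", "192.0.2.10"),    # not in the blocklist
    ]
    print(flag_accounts(sample_events))  # {'acct_1'}
```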
Q
Sam Armstrong: Yes, I think so. Imposing a duty on the social media companies is one of the only immediate tools and levers we can pull. I take Carl’s point; I do not think it is going to be sufficient to deal with the hordes of people overseas who are, frankly, conducting quasi-military-type activities against the UK through cyber means here, because criminal law is not the tool for that. Should they exist and are they necessary? Yes. Are they sufficient? Probably not.
Carl Miller: It is just massively insufficient. The reason why is that the platforms, however rich, clever or large they are, cannot reach beyond the platforms themselves. That is the problem. The way we have tried to respond to this problem so far is to have Facebook take down accounts, but take-down is a very weak response. That is essentially being priced into those kinds of activities. They have developed methodologies for setting up or acquiring new accounts as they go. In principle, I am not hostile to platform regulation across a range of online threats, but for those problems where we are dealing with a set number of actors who have specific capabilities and tap into a specific and constantly evolving tradecraft, I do not think it is going to be the tool to make much difference.
Q
Professor Ciaran Martin: I do not mean to be flippant, but obviously there could be as many different opinions as there are academics. I think that Government providing clear frameworks, laws and guidance to universities without infringing on academic freedom is where I would want to be. I do not think that it is fair to rely on universities to police this activity. It is extremely difficult in open and collaborative research environments like universities to be able to identify what is malevolent activity. If they do, it is extremely difficult to know where to go, what the relevant laws are, and so forth. The combination of a clear legal framework and clear guidance to universities is something that I personally would welcome. I imagine quite a few people, particularly in sensitive areas like technological research, would absolutely welcome that.
Q
Professor Ciaran Martin: They are not mutually exclusive. The thing about offensive capabilities is that they are sometimes seen as almost symmetrical—cyber is a sort of enclosed boxing ring, where you have offence versus defence—but offensive cyber can be used for anything. Our own British Government’s one declared offensive cyber-operation was against so-called Islamic State, not against the cyber-capabilities of another state.
I need to be reasonably careful about what I say here, but if you think that the US’s offensive cyber-capabilities are largely in Cyber Command and the UK’s in the National Cyber Force, the GCHQ-MI6-Ministry of Defence partnership, one would expect the operational security of those capabilities to be pretty good, and therefore that they would make quite hard targets for other actors. Similarly, some of China and Russia’s offensive cyber-capabilities against us will have quite good operational security, which will make them hard targets. We cannot rely on offensive cyber-capabilities to stop other people, particularly at the top end of the spectrum, at the elite nation-state level.
There is no magic panacea in the Bill, because no magic panacea is available. Even in the areas we were talking about, such as completely remote activity, one of the things that we saw anecdotally—there is some emerging research to support this—was that when the US in particular had a legal framework under which it could prosecute and indict, in absentia, people in China and to some extent Iran, that did have some impact for some time. It did not solve everything, but it did affect the behaviour of some actors—they could not travel to the west, most practically, because they were under indictment by the US and therefore at risk of arrest in all the US’s allies. It meant that the associates of these people, because digital infrastructure is global, could get arrested.
Some people working with Russian groups have been arrested in eastern European countries with which we can co-operate in law enforcement terms. Strengthening that sort of legal framework gives you something. It is probably more incremental than transformative, but it is still something.
Q
Rich Owen: Yes. Well, we are looking for something similar to the Australian scheme. The Australian legislation specifically exempts legal professional privilege, as well as seeking legal advice and assistance. That sort of model, which expressly exempts legal professional privilege, would be a suitable way forward for the scheme.
Q
Dr Nicholas Hoggard: You can, although I am afraid I will have to be very boring. Speaking with my Law Commission hat on, we are limited in what we can say with respect to those things that did not form part of the scope, regarding the protection of Government data. I am very sorry; I do not mean to be deliberately unhelpful, but we do not really—
Q
Rich Owen: Well, those provisions are modelled on terrorism legislation, when they concern a serious risk to the public, and there are suitable safeguards attached to them as well, so the position of the Law Society is to regard that provision as proportionate.
Q
Rich Owen: I was saying that an exemption on grounds of legal professional privilege, or seeking legal advice and assistance, could not be used for espionage, because you are outwith legal professional privilege. You are seeking to advance a crime, so that does not come within the ambit of legal professional privilege.
Q
Rich Owen: Yes. There has to be access to justice for everyone, including rich people. They can communicate with their lawyer, and if they need advice on the law, that should be privileged. However, if they are seeking, through their communication with lawyers, to advance a criminal offence, then that is outwith legal professional privilege.
Q
Poppy Wood: The role of whistleblowers in society is really important. I know the Government understand that. There are some good recommendations from the ISC about whistleblowers that I do not think have been adopted in this version of the Bill. That is about at least giving some clarity to where the thresholds lie, and giving a disclosure offence and a public interest defence to whistleblowers so they can say, “These are the reasons why.” My understanding is that at the moment it sits with juries and it is on a case-by-case basis. I would certainly commend to you the recommendations from the ISC.
I would also say—this was a recommendation from the Law Commission and also, I think, from the ISC—that lots of people have to blow the whistle because they feel that they do not have anywhere else to go. There could be formal procedures—an independent person or body or office to go to when you are in intelligence agencies, or government in general or anywhere. One of the reasons why Frances Haugen came forward—she has been public about this—is that she did not really know where else to go. There were no placards saying, “Call the Information Commissioner in the UK if you have concerns about data.” People do not know where to go.
Getting touchpoints earlier down the chain so that people do not respond in desperation in the way we have seen in the past would be a good recommendation to take forward. Whistleblowers play an important part in our society and in societies all round the world. Those tests on a public interest defence would give some clarity, which would be really welcome. Building a system around them—I know the US intelligence services do that; they have a kind of whistleblower programme within the CIA and the Department of Defence that allows people to go to someone, somewhere, earlier on, to raise concerns—is the sort of thing you might be looking at. I think a whistleblower programme is an ISC recommendation, but it is certainly a Law Commission recommendation.
Q
Poppy Wood: I have certainly read and heard concerns about journalism, about the “foreign power” test on civil society and about having Government money being quite a blunt measure for whether or not you might fall foul of these offences. On journalism, I think that is why you should never try to define disinformation: because those kinds of shape-shifting forms are very hard to pin down, particularly with questions like “What is journalism?”, “What is a mistruth?”, “What is a mis-speak?” and so on. We need to be careful about that.
On your specific question, I refer you to Article 19 and others who have really thought through the impact on journalism and free speech. I am sure it would be an unintended consequence but, again, we are seeing Russia using its co-ordinated armies on Telegram and other channels to target Ukrainian journalists. They are saying, “Complain to the platforms that the journalist is not who they say they are or is saying something false, so they are breaking the terms of service. Bombard the platforms so that that journalist gets taken down and cannot post live from Ukraine for a handful of days.”
That is just another example of how these systems are weaponised. This is where you can go much further on systems through the Online Safety Bill and the National Security Bill without worrying too much about speech. But I refer the Committee to other experts, such as Article 19, that have looked really deeply at the journalism issue. I think Index on Censorship may have done some work as well.
Q
Poppy Wood: I think that where we are now is much better than where we were last year, but my concern is whether this will all be law when we have an election. If not, what are the backstops that the Government have in place to focus on this stuff? It will get tested only when we have an election, really. If that is before March next year or whenever these laws get Royal Assent, there will be a genuine question of crisis management: if this is not law, what are we doing? I would ask that question of the Government and the civil service.
As I said, the disinformation committee in the Online Safety Bill is years down the line. Bring that forward—there is no need not to bring it forward—and please make sure that it is not chaired by someone from a tech platform. I would write that into the Bill, because otherwise there is a risk that that will happen.
Q
Poppy Wood: Why should the committee on disinformation not be chaired by someone from a tech platform? They have a vested interest in this stuff, so I would get an academic or someone from civil society—someone at arm’s length who can take a holistic view. These platforms will want to protect their interests on this stuff, so I would warn against that.
I would like to see the transparency provisions in the Online Safety Bill go much further. This is a bit in the weeds of the Online Safety Bill, if you will forgive me, but there is a very good clause in that Bill, clause 136, which says that Ofcom should ask whether researchers should be given access to data. It is an important clause, but it says, “Ask the question,” and it gives Ofcom two years to do it. I do not think it needs two years; I think we know that the answer is “Yes, researchers desperately need access to data.”
Almost all the stuff that is caught about malign information operations is caught via Twitter’s API. Twitter makes 10% of all the tweets public, and researchers use that to run analysis, so if you ever want to do research on disinformation, you always use the Twitter API. In many cases, that is mapped over to Facebook to identify the same operations on Facebook, but they are always caught in the first instance because of open data. I think that the Online Safety Bill, if this Committee and this Bill want to back it up, could bring that forward and say, “Either do the report in six months or don’t even ask the question.”
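To make concrete the kind of open-data access the witness is describing, the sketch below shows a researcher pulling a public sample of tweets from the Twitter v2 API, which was the usual starting point for disinformation analysis at the time of this evidence. It assumes a developer bearer token; the environment-variable name is a placeholder chosen for this example, and the API’s availability and terms have since changed.

```python
# Minimal sketch of researcher access to Twitter's public sampled stream
# (v2 API) for disinformation analysis. Requires a developer bearer token;
# the environment-variable name below is a placeholder for this example.
import json
import os

import requests

BEARER_TOKEN = os.environ["TWITTER_BEARER_TOKEN"]  # placeholder variable name
SAMPLE_STREAM_URL = "https://api.twitter.com/2/tweets/sample/stream"


def stream_sample(max_tweets: int = 100) -> None:
    """Print tweet IDs and the first 80 characters of text from the public sampled stream."""
    headers = {"Authorization": f"Bearer {BEARER_TOKEN}"}
    with requests.get(SAMPLE_STREAM_URL, headers=headers, stream=True, timeout=30) as resp:
        resp.raise_for_status()
        seen = 0
        for line in resp.iter_lines():
            if not line:
                continue  # skip keep-alive newlines
            tweet = json.loads(line)["data"]
            print(tweet["id"], tweet["text"][:80])
            seen += 1
            if seen >= max_tweets:
                break


if __name__ == "__main__":
    stream_sample()
```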
By the way, the European legislation that is equivalent to the Online Safety Bill makes that happen as of Tuesday this week, so researchers should, in theory, be able to access data. I would bring the transparency provisions forward, and I would really want the Bill to call out co-ordinated inauthentic behaviour.
That brings us to the end of this panel. On behalf of the Committee, I thank our witness for taking the time to give evidence.
Examination of Witness
Dan Dolan gave evidence.
Q
Dan Dolan: I am afraid I might have to give the frustrating answer that our evidence does not cover clause 20. There is clearly a concern there, but I am probably best leaving that to more expert witnesses to answer.