Advanced Research and Invention Agency Bill (Second sitting)

Department: Department for Business, Energy and Industrial Strategy
Wednesday 14th April 2021

Public Bill Committees

Stephen Metcalfe

Q Good afternoon, and thank you for joining us and for your excellent contribution. Anne, you made a very interesting point about the independence of ARIA, to avoid it potentially being pointed to as a political failure. If you are investing in high-risk, high-reward research, there will be failure—that is undoubtedly true. May we ask your advice on how we should measure ARIA’s success over the early years, before there is potentially any output that has demonstrated a transformational benefit to society? On top of that, could you give us some advice on how project managers should go about selecting projects to explore? Should it be just on the basis of interesting science, or should there be a vision of the commercialisation of that science at the end, to motivate them? We are only going to be able to fund a certain number of projects, and presumably applications will outstrip the funding fairly quickly.

Professor Glover: How we measure success in the early years is a very important question. I am not going to give you an exact answer, but what I might say is that maybe we should not try. That would be unusual, wouldn’t it? That is what I meant earlier about not just following the formula of, “You need to tick these boxes to demonstrate success.” Of course, you would hope that whoever is leading ARIA would have an idea of how the innovation ecosystem that ARIA supports is developing. They might have some ideas about the number of applications and where they are coming from, and could have a good look at and analyse that, along with the amount of interdisciplinary or multidisciplinary research that comes forward. That is always quite hard to fund. Historically, when I have been involved in such things, interdisciplinary research tends to get kicked around different agencies: “This is more for you.” “No, this is more for you.” Everybody is worried about their budget and thinks, “If you fund it, we won’t have to fund this from our budget.” Thinking about the number of applications that could come from a broad range of different disciplines—that would be good. I am not answering your question directly. I am just saying that it is very easy to say, “Let’s have a way of measuring success,” but sometimes that can be stifling.

It is a bit like—though perhaps not on ARIA’s timescale of years—the time of year when we plant seeds in our garden or wherever. If you want to measure how well a seed is germinating, and you keep pulling it up to have a look at it, you are really going to set it back, so sometimes you just need to think, “I’m hoping that in four or five months’ time this is going to be a broad bean plant with broad beans on it. I just need to wait and see.” I know that that is difficult to do.

The second thing you asked about is commercialisation. I cannot for the life of me remember who said this, but someone once said that there are two types of research: applied research and research not yet applied. That is quite true. There might be some areas where you think there is a very easy market, but if we look back and learn from experience, we find that an awful lot of research that turned out to be useful started with no application in mind. The whole area of medical diagnostics, for example, was pure research. There was no commercialisation; it was just a fundamental biological problem that was being investigated. Some of the outcomes of that research led to molecules called monoclonal antibodies. They make a beautifully specific diagnostic—supremely sensitive—that can pick out particular molecules of interest that might tell you whether you have a particular disease or have been exposed to a particular compound or whatever.

In renewable energy or an area around that, you might understand that there will be a lot of potential commercial partners and opportunities. In some other areas, perhaps not. This might be an opportunity to think about what the relationships would be like between ARIA and existing research funding, because it might be part of an ecosystem. I would hope that there were distinct roles for UKRI and ARIA but very good communication between the two, as well as very many other stakeholders, in order to identify areas that might not be suitable for UKRI funding but that might have a strong commercial or development potential that ARIA would be much more adept at supporting.

Sarah Owen (Luton North) (Lab)

Q It is a pleasure to serve under your chairmanship, Mrs Cummins. Anne, you talked about citizen buy-in. That would take an element of trust, so my two questions are around that. What could or would good transparency look like without stifling innovation, in both of your opinions? Secondly, if we do not have FOIs and we do not know precisely how this will be reported to us, do we need an ethical baseline to ensure that we are spending public money on the greater good?

Professor Glover: On the citizen buy-in, I think that would be reasonable to try to achieve. I do not think that it would be insurmountably difficult in many ways. To give you the example of some of the grand challenges that were funded at European Commission level, it came down to three brilliant projects. Which one would we fund? If the European Commission made the decision about which one was going to be funded, inevitably different member states would complain: “Why is that getting funded in that member state? This other project was just as good.”

All sorts of problems can arise. Whereas, if you asked European Union citizens which one they would like to be funded, they would say what matters most to them. That is quite an interesting insight into the mind of the European citizen, or it would have been, in that particular instance.

I do not think you are in any way betraying confidences; you are talking about whether it is a project looking at delivering limitless amounts of sustainable energy, or a project in mapping the functioning of the human brain, so that you might be able to exploit that in other ways. You are not saying how you are going to do those things; you are not revealing confidences or information that would be inappropriate or undermining of those doing the research. I think we might be worrying needlessly about that.

As to the ethical baseline, of course this has to be ethical. Tabitha and I are probably agreeing too much with each other, or perhaps we are going back to the same thing. If you are not open and transparent, you will have problems. That is just not rocket science. For example, there are many agencies that are not part of Government but that might receive governmental funding. Scotland’s National Academy, the Royal Society of Edinburgh, is one of those. We are completely independent from Government. We get funding from the Scottish Funding Council, which gets its money from Government. We are not subject to FOI requests but we voluntarily behave as if we are. If we did not do that, people would say, “They’re being directed by Government, so the reports that come out of the RSE will be influenced by Government.”

If we say, “This is how we approach it,” and if somebody comes to us and asks for information, we behave as if it were an FOI request. It has never been too onerous. The only onerous time for me with FOI requests was when I was chief scientific adviser to the President of the European Commission, when it became unrealistic, because I had such a small team and there were so many FOI requests. Generally, that is the direction we should be moving in. You do not want to hobble a new agency by making it seem that any aspect of it is secretive. To be able to demonstrate ethical compliance, you need that transparency.

Tabitha Goldstaub: Ethical transparency is key, but we also have an opportunity with ARIA to set a robust, rigorous ethical review process that is fit for the AI era. We do not currently have that.

There has been a tremendous amount of attention on public-facing ethical principles and frameworks for assessing AI products, but relatively little on the frameworks and practices for assessing research, or on how to launch and manage a data science and AI ethics review board in a way that cuts across disciplinary, organisational, institutional or national boundaries, as ARIA would need to.

If ARIA can work with others on this problem, such as the Health Foundation, which is collaborating with the Ada Lovelace Institute, or the Alan Turing Institute, ARIA could achieve its mission responsibly, become a beacon for other ARPA-like programmes and tolerate failure much more safely, because ultimately we need to break new ground and to do so with an ethics review, specifically for research that has anything to do with artificial intelligence. It would enable us to set real international standards, if we can get that right. It is both a risk and a huge opportunity for ARIA.

The Chair

Virginia Crosbie. I am afraid this will have to be the last, very quick question.

--- Later in debate ---
The Chair

We have one last very quick question from Sarah Owen.

Sarah Owen

Q Bob, you mentioned engagement and trust. We have heard a lot today about accountability and trust. How do you feel that we can get that trust without stifling innovation, and do you think FOIs are the best way to do that?

Bob Sorrell: If you are to get trust, you need to be transparent about the choices that you are making and how you are making them. Then, when you move to the execution phase, you need to allow the programme managers and the people who are driving the programme scenario to make the choices flexibly and in the quickest way possible. I understand in part what you are perhaps playing into, but I think you just need to strike the right balance between transparency on how choices are made and holding to account on that, and allowing people to get on with executing against those programmes once those choices have been made.

David Cleevely: I think the acid test is whether you can explain something to someone who is independent and is one of your peers. If you are happy explaining it back to somebody like that, that is fine. That is the way in which the system works. If you listened to Peter Highnam talk about how DARPA was organised, that was built into the DNA.

Sarah Owen

Q Do you think it would be useful to have that built into ARIA from its inception?

David Cleevely: I think it is essential. I would be very uncomfortable if you had an agency that did not have some degree of—accountability is not exactly the way to describe it, but you have to have a group of independent people reviewing what you are doing, not quite in the same way as you would do an audit, but it is basically that kind of principle. If I have to explain something, as I am having to do for this Committee, it is a lot clearer and more straightforward, and I feel a lot more comfortable about the way in which I can rely on the ideas and what I am doing. I think that process is very, very important.

The Chair

If there are no further questions from Members, I thank the witnesses for their evidence. The Committee will meet again on Tuesday at 9.25 am to begin line-by-line consideration of the Bill.

Ordered, That further consideration be now adjourned. —(Michael Tomlinson.)