Thank you very much. I know that others have questions to ask so I will leave it there, but I just want to say how inspiring it is to hear such positive reference to the power of public service, science and research, and to oversight as being an enabler rather than a burden.
Q
I am going to start with Dr Peter Highnam. How do you ensure evaluation and scrutiny of DARPA’s programmes outside what is mandated in legislation? What information do you gather to assess when to start and stop projects and programmes, and how are these decisions made?
Dr Highnam: That is a surprisingly big question. The P in DARPA stands for “Projects”, which is critical for a place like DARPA. We are not doing technology area x or y just because, and we do not do it for the long term. We have projects that are well defined at the beginning. A case has to be made. They are monitored, they have metrics and all manner of independent evaluation associated with them before we go out to find the best teams we can to participate and to be funded to work on that research. Then that project ends. That is very important: things begin, and they end.
To make the case for a project to get off the ground, we use a structure called the Heilmeier questions, named after the DARPA director in the mid-70s, George Heilmeier. They are five very important questions. They look easy, but they are very hard to answer well. In my view, that is the creative act in the DARPA model—to answer those questions well and make that case. Once the project is approved and teams are onboard, you then have regular evaluations. As things change in the world around us, in science and technology, with us in defence, and in other aspects of our environment, they may be overtaken by events. That is very rare, but it would be grounds for no longer continuing. Were we too ambitious in certain aspects of the programme? Do we need to change it or change some of the people participating in the teams? And so on.
This is a constant process. It is not about starting it up and letting it run until it finishes. It takes a lot of effort to make sure you know what you are doing when you start with taxpayer funding and the opportunity cost that comes with that. Then you keep an eye on it, especially during the transition of the results to our national defence.
Q
Dr Dugan: The story of Wellcome Leap actually dates back to about 2018, when the Wellcome Trust, from its unique position in the world, asked, “Is there more we could do to have greater impact?” It did a pretty careful analysis of innovation as it happened in larger organisations in the venture world and also at DARPA. The assessment was that in global human health, there is indeed this innovation gap. That innovation gap is characterised by larger programmes with higher risk tolerance, which are not driven by consensus peer review. This is very much the way we conduct programmes at DARPA—the intersection of a goal and the science and engineering that need to be pulled forward in order to attain that goal. That effort—those large programmes—is what Wellcome sought in the formation of Wellcome Leap. What I have observed in the last year of operation is that, in fact, there is this innovation gap in human health. It is the same one that was identified after Sputnik and that led to the formation of DARPA. The coronavirus is showing us just how much work needs to be done in human health across policy, equity and economics, but it also shows us the power of a breakthrough and how tough it is to get one.
I was the director of DARPA when the pivotal investments in mRNA vaccines were made. Many others came to the table to create this success for the world in this time, but we need more breakthroughs like that, and we need them faster. That is why Wellcome Leap was formed.
Q
Professor Azoulay: Absolutely, it is essential and I think it happens at multiple levels. It happens in the relative administrative autonomy that those ARPA-like agencies have, relative to their Government Departments of reference, whether it is the Department of Energy for ARPA-E or the Department of Defense for DARPA.
It definitely also happens at the hiring level and in the fact that one can hire programme managers in ARPA-like agencies from very diverse backgrounds, not necessarily a background in the civil service, and pay them according to rules that might not be those of the traditional civil service.
Focusing on programme managers, that matters because they themselves have quite a bit of autonomy in the way in which they delineate and orchestrate their programme. They have a lot more discretion in choosing what projects to fund and assembling the teams that will perform those projects than would be the case in a traditional science funding agency, such as the National Institutes of Health or the National Science Foundation in the United States context or, I would think, UK Research and Innovation in the British context.
Q
Professor Glover: Just for easiness, can I ask Committee members to just call me Anne? Otherwise it is a bit of a mouthful.
On the idea of five or six individuals, I would caution on that slightly. I am partly bought into the idea, but if you are identifying five or six individuals, you have already pinned your colours to the mast in what you want. You have already prejudged the areas you want to work in or the ideas that you are interested in.
Where the five or six people might be really important to identify is for the running of ARIA itself. Whether it is the overall director of ARIA or the research leaders in the different themes that might be funded in ARIA, they will be key people and they need to be credible, trusted, very effective at communication and really open-minded. In my view, a large part of the success of ARIA will come from having quite inspirational leaders throughout.
In terms of how you fund and who it is that you are funding, I would go back to what I was alluding to earlier. There needs to be a big conversation about this. There are often older men who have got a reputation in research, so they are naturally the ones we go to, but as I know from bitter experience, as you get older, sometimes your thinking closes off in particular areas and you are less open to ideas. I am thinking of Professor Donald Braben, whose comments the Committee would probably be very interested in. He set up a venture research unit in BP, back in the ’90s I think, and has written several books about this kind of blue skies research area.
What Braben said is that we should look for “irreverent researchers and liberated universities”. Do not look for people who have a research area that we think is really important and we must go there. Debate widely among researchers, of course, but also Government Departments, devolved Administrations, foresighters, businesses, citizens. Let us imagine the future. ARIA could be the stepping stone, if you like, to inventing that imagined future. For a future to exist, you have to imagine it in the first place and you have to convert it into what you would like. There are lots of different ways of doing that. With inspirational leadership, you can move towards that. You can probably increase dramatically your chance of getting it right by having an irreverence around what you do, and not the usual measures of success.
Q
Professor Glover: I would argue that there is huge value in that. Obviously, the funding is coming from Government, but by giving it freedom from Government you might also be giving it the freedom to fail in many ways, and that is exceptionally important. If it is seen as very close to Government—whichever Government is in power—it potentially becomes a bit like a political football, either in what is being funded or in the direction suggested for where ARIA funding should go.
If there are notable failures of funding, which you would expect if it were a high-risk, high-reward funding agency, political opponents will also say, “Well, look, this is a complete disaster under your custodianship. Here are all the failures.” You just want it to be separate from that. It is also part of trying to embrace the unthinkable, if you like, in terms of the research we do and the areas we go into. Necessarily, those will sometimes be difficult areas, and not ones that you should expose Government to either. In the spirit of opening everything up, I would say that keeping that independence is extremely valuable.
Tabitha Goldstaub: I totally agree with what Anne just said—I would have said exactly the same thing. I think that the separateness and independence are really vital to the success of ARIA. The only thing that I would really think about adding here is how important it is that ARIA does have a relationship with Government, because it will need to have many customers, both private sector and public sector. The programme managers will need to create those bonds with central Government Departments individually.
I think that a commitment from Government to let ARIA remain independent but to become a good customer is very important. The health and transport sectors are good examples of where that might work. What is different is that a surprising number of these next big scientific fields, and these next big breakthroughs, such as artificial intelligence, are going to depend on systemic transformation, where you cannot separate the technology from the policy and regulation.
So yes, ARIA has to be independent, but it also needs to ensure that it works really closely with central Government and with regional and local government. Local government spends about £1 billion on procurement, and cities are key investors in infrastructure, so finding a good link with local government, as well as with central Government, is important. This will hopefully end up creating, as Anne suggested, a way for people to feel part of this, with regional strengths delivering benefits to actual localities. Even if it is within the next 10, 15 or 20 years, it is really important that government feels part of that, even though ARIA is independent.
Q
On independence from Government, from looking at your bio, Anne, I can see that you have worked for a few public agencies. If ARIA does not have the public contract regulations and freedom of information in place, will that free it to do what it needs to do? Should we see that as a positive, as opposed to a check and balance, given that we are referring to public money?
Professor Glover: I will deal with that point first—it is an exceptionally interesting point. Initially, when I saw that it might not be subject to FOI, I was thinking, “What are the pros and cons of that?” There is one thing that needs to be fundamental in ARIA, and that is an openness and transparency about what it is funding and why, and how it is doing it. For most things—UKRI would be similar to this—what you provide information on obviously cannot be something that would break the General Data Protection Regulation or that would be commercially sensitive. That should hold exactly true for ARIA as well.
There needs to be some thinking around the whole aspect of openness and transparency, because that brings along with it trust and engagement. If there were any suggestion that Government funding was going into ARIA and it was being syphoned off into particular areas, and we could not find out what those areas were, there would be nervousness. People would, quite rightly, object to that, so there would have to be some greater thought given to how the agency is able to be open and transparent. It might be writing its own rulebook in that area, about what it will provide information on and what it should not.
On whether £800 million is enough, you are asking a scientist and a researcher here, so no, it is never going to be enough, but we have to start somewhere. I cannot make a direct comparison with DARPA’s funding, which is about $3.5 billion or $4 billion per annum, but I might be a bit out of date on that. It does not seem unreasonable to me to start at that level of funding and to start off on the journey to see what is and is not working, where there is greater demand and where you might need more funding to meet it. What you would want to see is that this was such a success that there was substantial demand for funding.
On the other hand, you do not want to get into the situation that standard research funding has—I have certainly encountered it many times over the years—where you are putting in 10 research proposals to get one funded. That is an enormous waste of everybody’s time, including that of the agency that is funding the research. There needs to be a balance between how much money is available and what you hope to do with it.
The last thing I would say is that how that funding is apportioned needs to be carefully thought out, because there needs to be some security of funding. Traditionally in the UK, we have normally had three-year tranches of funding. Long before the end of the three years you have to try to think about how you get continuation of funding. You might hope that ARIA could look at a different model of funding, which might span different timescales depending on what the nature of the project was.
Many projects, particularly ones that are quite disruptive in thinking, will not deliver in a short period of time—two or three years. Some could do, but some will not, so there needs to be that security of funding over different annual budgets to allow the investment over a period of time.
Tabitha Goldstaub: I will start with the amount of funding. I see the £800 million as just a start. I think that £800 million is sufficient as long as ARIA works in partnership with Government Departments, the private sector and other grant makers. ARIA should not be restricted in matching or exceeding the Government funding with funding from the private sector. There are people in the community that I have spoken to who think that, for true intellectual and financial freedom, ARIA should be able to more than double the Government funding. It was good to see in the Bill that ARIA could take equity stakes in companies and start-ups in a venture fashion, which could increase that funding over time and enable more funding decisions. I see the £800 million as really just a starting point.
On freedom of information, I agree with Anne that openness is key. Transparency fosters trust, and I do not think there is any need to stop freedom of information. We need to keep freedom of information to help with the efforts for connectivity. If the community are going to feel part of ARIA and will it to do good things, they need to be able to use freedom of information. I cannot see any argument against this on the grounds of administration costs. Earlier this morning, we heard Ottoline Leyser say that UKRI gets 30 requests a month. If ARIA is 1% of the budget of UKRI, perhaps it could get 1% of the requests, which would be fewer than four a year. For that reason, I cannot see it being a burden.
The other reason why there is a desire for secrecy and no FOI is that people traditionally are not comfortable innovating and failing fast in the open, but that is changing. DeepMind has teams. I have spoken to Sarah Hunter, who is at Google’s moonshot factory, X. She explained how they started in secret and how appealing that felt as a way to protect people from any feeling of failure, but what they learned is that there are so many other, much better ways than secrecy to incentivise people and to give them the freedom to fail. Actually, allowing for more transparency builds much more trust and encourages more collaboration and, therefore, better breakthroughs.
Anne has spoken about the community. I definitely will speak again about the community, but in addition to the community engagement, ARIA will need to have a press department and media engagement teams that are separate from BEIS, separate from the grid and separate from the Government, to enable it to be agile in its communication and foster a two-way conversation. In order to answer your question, I really think this is the key point: openness and transparency create more trust and more breakthroughs.
Q
Felicity Burch: As I know you are aware, I think having a long-term approach to funding R&D matters hugely. From the perspective of the business community, having institutions that are in it for the long run, that they know they can come back to and that they are aware exist, is really important for their own confidence to invest.
Thinking about the agency slightly more specifically, when it comes to its own patience, one of the things that CBI members have highlighted to me as a particular benefit of the DARPA model is the commitment to funding their programmes for significant periods of time. For example, there might be 10-year funding with three-year gates to check if the project is working. Those commitments, with that 10-year view—so long as everything is going more or less according to plan—are hugely important for bringing business funding alongside that. So if we can bake a long-term view and patience into ARIA from the start, it will certainly help it to be successful.
Thank you. Sir Jim?
Professor McDonald: It is nice to see you, Minister. There is a requirement here to have a significant cultural change—that is embedded in your question—to move away from the value-for-money concept that is deeply embedded in the UK Research and Innovation funding structure. That is important, but of course we would need to innovate the funding model, which is what is being sought here. The value for money of disruptive innovation may not be assessable, as you indicated, until decades later, so we will need a longer-term outlook or alternative approaches to assessing value, such as a means of building capability and capacity in both technology and skills.
Of course, projects that were deemed unsuccessful in achieving their goal may produce value in terms of people, skills and lessons learned, so we must take a long-term view. I think we see that notion of patience, but it is about the ability to have that highly driven, focused approach that the executive officers and the board of ARIA will take and—we may come on to this—the ability to fail fast and elegantly and not be punished for failure as long as the process has been driven openly, transparently and with excellence underneath it.
I would say, absolutely, long-term vision and drive forward. If everything worked and everything was successful, we should challenge ourselves and think maybe the questions were not quite as challenging as we thought they might be. Failure is not something we should be discouraging—it is about risk and collaborative approaches to driving problems to a solution—but long-term vision is absolutely essential. That is why, as you have heard from Adrian and Felicity, that patience and that long-term view are key. It should become a very natural part of the UK landscape, so that it is something that we boast about, that acts as an attractor for business and investment, and that helps us to attract and retain talent.
Adrian Smith: Let me echo everything that Jim said. The scale of mission that we would hope to see from such an agency means that the timescales will be long and we will need to build new research capability over those timescales, in so far as we are interacting with technologies, and perhaps new supply chains. If those are to come out of the woodwork, they need to believe that we are in it for the long term and that there is patience on the part of the funders and others. The timescales are really important, not just because a hard problem will take a long time to solve, but because a hard problem means we will need to build all sorts of new capabilities and capacities. To have the courage to invest in those, we need to know we are in it for the long term.
Q
Adrian Smith: Is that a question for me? It is probably a better question for Felicity. Going back to the earlier comments, the fundamental is a trusted long-term commitment from the Government that we are really in this, that we have a plan with clear funding milestones and that we will stick to that plan. That is what will give the international community the message that we are really serious about this. That serves two purposes: for the narrative of the UK, and as an attractor for brilliant people, whether they are in research or industry around the world, to come and join in this long-term challenge.
Professor McDonald: How do we attract them? The scale of the ambition will be a major attractor to someone with the executive excitement and experience that they will bring. Large-scale ambition and, as we said earlier, a commitment to the long term and to making this work for the UK through a long-term, integrated approach. I suggest that the CEO would have to have experience beyond academia; preferably, as you have suggested, Minister, including industrial experience—that ability to take the journey from concept through to proof of concept, demonstration at scale and deployment. Ultimately, commercial exploitation is key.
I can assure you that the engineering community will be well engaged with this as we help to bring forward individuals of the right stature. Industry expertise and understanding should be a prerequisite for ARIA personnel. An interesting example, which many of our colleagues in the Committee will be familiar with, is the vaccine taskforce: bringing together industrial expertise—traditionally competitive companies large and small within their supply chain—with Government officials and the National Institute for Health Research. That was a fantastic microcosm of large-scale, high-risk and ultimately high-reward outcomes. In many ways, that gives us a precursor for some of the approaches and cultural changes that would be needed to take that forward. For the chief executive or chair of the board, it would be great to have industry-relevant background, a commitment to innovation and excitement about the scale and potential impact of the work that they are taking on.
Felicity Burch: I listened to a number of the earlier sessions, and I was delighted to hear the focus from so many stakeholders on the need to build a diverse team within ARIA, but also to think about the diversity of the community that we engage in it. One of my reflections is that we are trying to build something that looks a bit like US DARPA, but we are 60-plus years on now, and the international, national and social picture is completely different. We have an opportunity to build something that really excites the next generation of researchers and business people.
If you look at businesses that are trying to achieve those same goals and the practices they put in place to try to recruit brilliant people, you will see that, first and foremost, purpose really matters. Clearly defining the mission of what ARIA is trying to achieve when we get the team in place, making sure that it is something that excites people, having a clear market, and also solving national and international social problems will help encourage really bright, brilliant people to get involved.
Secondly, it starts with the senior team. We are building this team from scratch, and we need to make sure when the team is being recruited that it is diverse in the broadest sense possible—that we see women, ethnic minorities, and those with disabilities represented on the senior team for ARIA to really send a signal that the way we want to innovate in the UK is diverse and that we want to make the most of all our talents around the country.
Q
Felicity Burch: One of the really exciting opportunities from ARIA is the potential for joint ventures and engagement. Essentially, my answer here is pretty short. Go ahead and do it, but make sure you engage with business communities a bit further down the line on exactly how those funding mechanisms are designed. Different businesses at different stages of their journey will be interested in different funding mechanisms.
Q
Bob Sorrell: That is a great question. If you compare and contrast us with the Americans, there is a definite culture in the UK that failure is something that you hide under the carpet, put away and forget, but science is all about failure and pushing the boundaries. If you are not failing, you are really not challenging those boundaries. I think it is about establishing a culture in which we can accept failure and move on.
The problem comes, in both industrial and academic environments, in facing that day, because there is a tendency to keep things creeping along because you have invested so much effort to get them to this particular point. You do not want to kill it, because then you have to stop the project, and people feel emotionally involved in it. That creates a whole series of issues. It is about making the hard decisions and learning from failures. We describe them as failures, but actually they are some of the most valuable learning experiences that we gain, and they stop us reinvesting in making the same mistakes in the same areas if we are really careful about what we extract from them, and do not just try to shut them off in a box, in a rather embarrassed way, and say, “That’s something that we will leave to one side.”
Q
David Cleevely: As well you know, I am very keen on establishing networks of individuals and making sure there is lots of exchange. Part of the essence of putting an agency like this together is to ensure that you get a lot of cross-fertilisation. There should be a great deal of exchange going on with that, and you would, of course, have to have in place conflict-of-interest safeguards and the various other peer-review processes.
It is very important that an agency like this would work closely with the private sector. My first encounter with DARPA goes back to 1977. At that point, I was working for Post Office Telecommunications, which shows how long ago it was. We were discussing the idea of funding this funny thing where you cut information up into packets. A lot of the collaboration that was done on all of that involved a great deal of what was then a monopoly, though a commercial entity, helping to fund those things. That kind of stuff is extremely important and needs to be built into the processes by which this agency operates.
Can I just pick up on the notion of failure? There are two kinds of failure. There is the kind of failure that we have seen with SpaceX, where you send a rocket up and you land it and it crashes or burns up after about 12 minutes because it is leaking fuel. That is one kind of failure. Quite honestly, the private sector got involved in replacing NASA because NASA became too cautious about dealing with that kind of thing.
There is another kind of failure where you have picked the wrong technology—the wrong way of approaching a problem. I think we are talking about the second kind, and about recognising how to stop that. That is a peer-review process; that is a way of making sure you do things. What we need to avoid is reacting to failure where the rocket is crashing on touchdown. That is not really failure; that is simply experimentation.
Q
Bob Sorrell: I would say three quick things. First, ensure that there is a real partnership between industry, Government and academia in actually shaping the agenda for ARIA. Secondly, I would have flexibility; we heard earlier, I think from a colleague from the CBI, about models in which we could second people into the ARIA organisation. I think there is an opportunity to do that, and we have had experience of doing that previously.
The third thing is that ARIA will provide some really important learnings, and it should be able to feed those back into UKRI, and vice versa. UKRI has some valuable learnings that it can impart to ARIA. This is an evolutionary process through which both parties will definitely benefit, and it should be framed in that light.