Connecting Research to Impact: A Conversation with the CEO of Research Nova Scotia
Stefan Leslie shares his insights on funding research that responds to the needs of society
This week I interviewed Stefan Leslie, the CEO of Research Nova Scotia (RNS). RNS is an arm’s length, board-governed organization that deploys research grant funding on behalf of the provincial government. Notably, RNS was recently given a new mandate to fund research that will drive provincial economic growth, and they have just finished their first call for proposals under the new strategic plan they developed in response.
My goal in this interview was to learn how Stefan Leslie and the RNS team approached rebuilding their strategic plan to address their new mandate, in the hopes of coming away with a clear articulation of what it means for research to serve the needs of society that can be generalized to other Canadian funding agencies. Stefan’s commentary is insightful, and reflects a deep understanding both of the research he supports and of what it takes to move the results of that research beyond the lab.
While Nova Scotia may be leading the charge, I strongly suspect that similar changes are coming to the rest of the provinces in the near future. The strategic plan developed by Stefan and the RNS team will no doubt serve as a useful blueprint for both provincial and federal policy makers that are tasked with connecting research to societal impact.
Your email client will probably truncate this post. My key takeaways are presented at the end, so be sure to read the web version if you want to get the whole story. Many thanks to Stefan for taking the time to share his process.
Interviewer’s note: Stefan Leslie approved the final version of the section entitled “Interview with Stefan Leslie” and had editorial input on that section, with the option to rephrase and expand on the ideas discussed in the interview without changing or removing any intended meaning. The key takeaways presented at the end are my own commentary, and do not necessarily represent the views of Stefan Leslie or Research Nova Scotia.
Interview with Stefan Leslie
KB: Tell me about yourself. How did you find your way to leading Research Nova Scotia?
SL: I actually came into this from managing fisheries: 12 years in Canada and 4 in New Zealand. When I say managing fisheries, I mean for the government authority, in Canada for DFO and in New Zealand for the Ministry of Fisheries. This included stock assessments and meetings with fishermen in the local fire hall and all that kind of stuff.
What was interesting about coming from that direction was that it is a very science-based activity where you have a very direct connection between the research and science that’s being done and resulting management and policy decisions. That includes fundamental research work around how stock dynamics operate, and then very applied stuff like figuring out how we can exploit this stock while remaining sustainable. I also recognize, having worked in that environment, that you cannot optimize just based on the outcome of a scientific process. People are involved, and they have to come up with the trade-offs and develop the normative choices that come with competing interests around science and research. I also came to recognize my own limits as a public servant and both the limitations and the advantages of working in that kind of environment.
That background then led me to lead a Network of Centres of Excellence, which is a program that no longer exists in Canada as it was cancelled four or five years ago. This was a federally funded program embedded in a host organization, typically a university, that did very purposeful science. It was to pursue certain outcomes that were of importance to the country, develop high-quality work, and also train the workforce, engage in knowledge transfer, and all the rest of it. But it got me into the academic environment and illuminated the value that academic research, or research that happens in the universities, can bring across a whole bunch of different dimensions of Canadian society and the economy. It also highlighted for me the value of an arm’s length, non-profit-style board-governed world.
When Research Nova Scotia was started, and the conversation in the province of Nova Scotia really kicked off around 2015 or 2016 and had about a 3- or 4-year gestation period, the idea was to create an arm’s length organization devoted to funding “purposeful research” with research as the basis for this pathway to prosperity and societal improvement. That was the sentiment in that time, in the pre-pandemic world. So as they wished to launch this organization, I was very interested because I thought that it was an unexplored dimension in the provincial interest in science.
I wanted to explore what this meant that's different from straight economic or federal interest, because the dominant theme of research is often driven by federal considerations, but the provinces are actually interested in different things. They of course are managing universities, but also research hospitals and healthcare systems where health research happens. So there's a provincial jurisdiction element. Provinces are responsible for the delivery of the majority of public or social services that you and I and everyone else need and enjoy and comment on daily: the education system, the health system, transportation, economic development, and so on. As a result, provinces have a real interest in the outputs of research, and are asking what we are actually learning from research that we can then apply in a direct way.
KB: It sounds like part of the goal of Research Nova Scotia, even before the recent mandate change, was to fund research that would have a direct socioeconomic impact. Is that accurate?
SL: From the outset, I should probably say that in Nova Scotia, and probably other provinces, there have been various approaches to supporting research since at least the 1930s. Research Nova Scotia, although it was created in 2019, is at the end of a long train of a variety of different options and selections and ideas around how the province can support research. This is attempt 7 or 8 at this particular problem, so we’ve tried every sort of model. The immediate predecessor organization had really focused on health research and leveraging federal funds primarily through the Canada Foundation for Innovation. So there really always was this purposeful element of connecting the public investment to issues that matter here. What needed to matter was the research outputs: what we learned, beyond just what we were funding, research excellence, and the people trained. What value could we actually extract from it?
For the first five years more or less of our existence, the province had established research priorities in regulation itself. They were fairly broad and enabling: inclusive economic growth, a healthy population, and a strong healthcare system. Quite broad, but nevertheless indicative of a direction that they wanted to take. Our first approach within that purposeful or intentional model was to develop an outcomes- or missions-based outlook in which we would define fairly broad but meaningful outcomes and we then invited researchers to come and creatively articulate how they could deliver those outcomes. We were quite open for researchers to provide insight into how they could devote their skill and craft in service of achieving that on behalf of the province.
There are diverse ways you can divide up the types of research that exist. You can break it into basic or fundamental research, applied research, and experimental development, the way the OECD does. Sometimes you can throw in curiosity-driven or investigator-led research. I don't think necessarily that these are all that helpful as distinctions. We were never in the game of funding discovery for the sake of pursuit of knowledge, as that was not the intent at the provincial level. There was a desire to use scientific tools to fill a particular need, however, and that can call upon the need to engage in fundamental research. If that's the limiting factor to achieving whatever societal outcome you want then that's what you ought to be pursuing. We looked at it from the perspective of identifying the "pull" and then deriving the research need from that, regardless of whether it is applied or fundamental by that set of definitions.
As you pointed out, though, there has been a significant shift in what Research Nova Scotia has been asked to do, which has been underway now for the past eight months or so. It was initiated by a change in the legislation that underpins what Research Nova Scotia is here to do, in the spring sitting of the 2025 legislature. Many things are still the same: we're still an arm's length organization, we're still independent, we still have a board of directors, we're still supporting research in areas of provincial importance, and, consistent with past iterations of the research endeavour in this province, we are not performing research ourselves. Instead we're coordinating and funding others. We are not intervening on choices around methods, nor are we controlling outputs. None of that stuff that flared up during the Harper years and sparked discussion around political control over science is part of it. But there are a number of really important changes that have happened.
The first is that under this revised legislation, the Minister of Advanced Education, which is the part of the crown to which I have a reporting relationship, now has the discretion to establish research priorities. So it's been taken out of regulation and given to the minister. You could say that that's just taking your coins from one pocket and putting them into another, because it is still a government process, but the regulatory process and a minister's process is really quite different, or at least it can be, since the minister now has in their sole discretion the ability from time to time to determine those research priorities. That's a critical difference for us.
The second is that the minister did exercise that discretion very early on under the revised mandate by giving us an overtly economic focus. So we now have to demonstrate advancements across 7 key outcomes which are grouped into two economic areas: economic growth and economic productivity. More specifically, provincial economic growth, and provincial economic productivity. So everything we do, and our authority to spend public money, depends on our ability to connect the research we fund to the achievement of outcomes within those two areas.
Third, which is related but quite important, is that there is an expectation of our ability to connect research outputs to those outcomes in a very direct way. The expectation from the minister is to show how research is contributing to measurable changes in those outcomes. A lot of weight is held by the words "is" and "measurable". "Is" really does imply that there has to be that obvious connection, and "measurable" means you have to be able to point to some evidence.
The fourth major change is that we have had an adjustment of the priority sectors. I mentioned before that it was inclusive economic growth, a healthy population, and a strong healthcare system, and now the effort that we must devote ourselves to has been directed to three different sectors:
Natural resources, climate change and clean energy;
Life sciences and health sciences; and
Construction and transportation.
So that has been the shift that happened on the 1st of May, 2025.
KB: You’re one of the first if not the first provincial research funding agencies that has been asked to reorient towards economic impact, but I don’t think you’re going to be the last. Tell us about the process you undertook to develop your strategic plan in response, as a guide to those who may be asked to do the same in other provinces.
SL: There was no transition period. It wasn't as if we were advised that we needed to change our priorities and get there in 12 months. We had the new set of priorities, and our research activities needed to respect them, effective immediately. We devoted the summer to developing a new strategic plan, and in fact the act specifically directed us to create and implement a strategic plan to deliver on our new mandate. That's good practice anyway, but nevertheless it was helpful to have that pointed out to us.
We went through a relatively formal iterative process where we engaged with insiders, got some drafting done, sent material out, and gradually over the period of about three months refined the ideas into a strategy that then went to the board for further refinement and then was approved in October. So approximately six months from creation to adoption.
It has three different areas of focus. The first, which almost sounds comical to state, but I think is actually really important, is that we still fund research. The reason I dwell on this is that it matters what people call “research” because it’s actually a term that carries a lot of weight in a lot of different ways depending on the circumstance. If you go back to the Frascati definition of the OECD, it’s got these aspects of novelty and creativity and uncertainty and all the rest of it, and so for us it has to be more than a phrase, which I thought was a really useful distinction point that I learned from this engagement process.
A colleague organization in the province described what they did within a particular sector as "managing scientific activities to resolve knowledge gaps", and that is a perfectly reasonable definition of what research is, but for us it had to be a little broader. We needed to see absolutely the connection of the research outputs to realizing those outcomes, but we take a broader view of research. For RNS, the discovery or the knowledge that's gained through that research process has to be applicable beyond the immediate circumstances by people who weren't involved in the initial investigation. This differentiates us from simply supporting R&D, which may happen in a company, an industrial participant, or a government department that has to deliver on a policy mandate.
The second, consistent with that interpretation of what research is for RNS, is that we adopted a sector-level view, so that our support for solving problems, removing barriers, pursuing opportunities, whatever it is, would be applicable or relevant to a number of participants.
Senator Deacon, back in October or November, put out a report talking about all of the different federal level programs that support R&D, everything from tax credits to direct support, and there's hundreds of them. Our view was that there's no point in us simply being number 135 where there had been 134. Rather, we asked how we could incentivize collaboration within industry and make sure there was a connection between industry needs (and other indications that the research will produce economic value) and the research performers, whether they're at a university, a not-for-profit, or even within industry. So we were looking for opportunities to incentivize this collaboration where there are common challenges that have an economic benefit but are broader than just what one company needs in order to sell a particular product, improve a particular process, reduce their cost structure, or whatever it might be.
That leads to the third main takeaway from our strategy. Consistent with the first two parts, it behooved us to create a couple of different pathways to research support. First, we have a convened model which we call Focused Research Investments. These are large, they're actively coordinated, there are multiple component or constituent projects that fall out of them, and we are looking for active engagement with other funders to really advance in an area over a period of multiple years. So you pick a sector for which there is an opportunity to manage a risk, or pursue an opportunity, and you say that, by virtue of systematic and consistent application of money, time, and talent, you can really advance things along. Second, this is matched by a more responsive model, which enables those who have an idea, or their own challenge that they can cogently describe, to get funding over a shorter period of time toward resolving that challenge. You don't have to have all the other bells and whistles around coordination here. You do need to be able to articulate how it fits our definition of research, but you don't have to coordinate a larger scale effort. This is the opportunity for which our first intake round just ended. With that first round, applicants are being invited to propose what they think is important for us to pursue over the long run.
In both cases, whether you’re coming through that convened larger model or the responsive model, you need to be able to demonstrate a plausible pathway to impact. We can discuss the economic model later, but the summary point here is that applicants do have to demonstrate that the need of whoever’s going to apply the research, which has to be outside the research community, shaped the direction of the research that’s occurring. Applicants need to understand that particular pathway. This is different from promising success. Rather, applicants need to demonstrate that they have thought about who and where and when information moves through hands or through different parts of the value chain, to go from the research sphere to the application sphere. Just as important is the demonstration of an effective mechanism or governance structure within the research component that can make adjustments or accommodations as you learn to ensure that your focus is always on that outcome.
We have lots to learn on whether this is going to work, but by taking these two different models and keeping our eye on that application phase, we’re hoping to back that into a model that will allow us to identify and then track success of projects over time.
KB: Your convened model, and in particular the idea of removing the separation between research and industry, has some parallels to a more agile version of DARPA. Was that a source of inspiration?
SL: Certainly DARPA, or Skunkworks is another one that’s commonly brought up, are really great for engineering challenges where it’s easy to identify a measure of success or failure. To use the Skunkworks example, if you are designing a spy plane, you can tightly specify the design and success criteria: it must be able to fly at 90,000 feet and get a resolution of X, for example, and then you have either a spy plane, or you don’t, that meets those design criteria and you can design a research agenda around that. There’s certainly an element of that.
I think what's different here is the need to find a way to integrate with the existing organizational structure of how research actually happens in this country. Nova Scotia has 10 universities, a community college with multiple campuses, and two or three health authorities delivering healthcare. We also have to recognize that there's a strong social component to whether something actually happens. It's not just an engineering challenge.
I'd say the inspiration was partly that DARPA model and the various other ARPA-[add-a-letter] agencies. There are certainly other innovation models, and the US, for all of its challenges around science these days, is extremely creative in trying different models and different ways: philanthropically supported research is considerably bigger there than it is here (although I think it's growing here), convergence research, developing the Focused Research Organization model, speculative technology, working on deep tech challenges. There's the UK version of some of this, the Advanced Research and Invention Agency or ARIA, that's been around for maybe two or three years, Deep Science Ventures, various other organizations, etc. I'd say that what unifies all of these is that they're experimental and adaptive. It's not just that they have this connection of a societal or economic or engineering need into the research community. I think what is more special about them is that they recognize that you need to adjust the model according to what you're trying to achieve, be honest about or look carefully at what it's taken to succeed, or not, and adjust going forward without trying to determine the model that you're going to hold true to forevermore.
KB: Given the number of unknowns between research and economic impact that may have nothing to do with the quality of the idea or the researchers involved, how are you planning to connect the research you fund to the downstream impact?
SL: This has been an area which has obsessed us since the economic mandate for RNS began last May. But more broadly, this has proven to be a vexatious problem for at least 70 or 80 years. It has been the subject of considerable attention for probably 50 of those years. It is not one that is easily cracked. A few general comments before we get to the specifics, because I've got a couple of different ways to answer that question in a more concrete way.
I think it is well established, and probably beyond contention, that there is a public or a social benefit to public investments in research and development. There's been a variety of different papers on this, and probably the most recent or maybe best known was by the National Bureau of Economic Research in the US, where Lawrence Summers and Benjamin Jones looked at the social value of research. It is very difficult to find a study that doesn't give you a multiplier of somewhere on the order of 4-10. For every dollar spent on research, you get at least four back in social value. The issue is that it really matters where that accrues.
This is the challenge around R&D: you have a high degree of uncertainty in where those benefits are felt, but you have high certainty over where those costs originate. So you have what you could probably characterize as benefits that are diffuse, distant, hard to attribute to an individual supporter, and often disconnected from the original investor or research performer, while the costs have all the opposite characteristics. They are coming from a very specific pot of money. Even when R&D is done to improve a very specific thing, say a manufacturing process, it will be on the order of years before you see the application of that knowledge to the improvement of that particular process.
There was a study done not too long ago that looked at the timeframe between publication in a scientific journal, where the research has been "concluded", and the application for a patent. Not the use of that patent, mind you, just applying for it. That's on the order of three to six years. So, if you're looking for evidence of economic benefit from the activity of research, you have to think about the timeframe of years. The more fundamental the research question, the longer that timeframe is. For basic research, we're talking probably 20 years.
[[Interviewer's note: Stefan's numbers line up with the NBER study linked above, which suggests that "studies of R&D, product introductions, and product sales suggest quite rapid linkages between up-front costs and peak market payoffs. A total delay of 3.6 years appears reasonable, and a 10-year delay appears very conservative" (p. 14), and that "citation network analysis suggests that even more remote basic research investments begin paying off within 20 years."]]
You can always point to, say, the connection between someone studying the spit of a Gila monster and the production of Ozempic, or you can point to research into how bird beaks were formed through an evolutionary process connecting to methods to improve the aerodynamics of high-speed trains. That all happens, but it is a very tortuous path, and you can make those connections only in retrospect. It’s very difficult to do it looking forwards.
Now I want to answer your question in a more specific way, with two different perspectives. The first is rooted in evaluation. Looking back, how do you demonstrate that research you've funded has resulted in value? As I've just spent a lot of time describing, you need something that goes well beyond the timeframe of the research project itself. You need to be able to reach well beyond the end of the research, and probably coordinate with others to be able to track it back, and you need to focus as much on efforts and intent as on results. You can't hold a researcher to account for a failure to realize economic return, because there are so many aspects that are going to either frustrate or amplify creation of that economic value over which they have no control, but you can certainly expect that there is effort and intention behind what they do. I think that the metrics you need to look at need to allow you to develop a sensitive understanding about why something worked, or why it didn't work, so that you can apply those learnings to future investigations.
This is where I need to get to the second area that represents a very different way of thinking about the problem. How do you actually select research projects that have the prospect of economic benefit, given the timeframe and the lack of attribution? Basically, how do we decide what represents the best bet? If you line up 10 ideas in front of venture capital, they're going to have about a 10 percent success rate. They know what the product is, who the principals are, what the market is, and what the financing looks like, and the success rate is still that low. So, how can you provide information to someone like us, who is being asked to select research well before the point where VC would ever take a sniff, that signals a chance at success?
I think the only way you do that is you look at this in a different way. You have to completely change what you’re looking for. You need not focus on evidence or expectation that whatever comes out of the lab is going to be applied, but instead look for indicators of how the problem within society that needs to be solved is influencing what goes on in the lab. We need to move away from the idea of moving knowledge out of the lab, this “knowledge translation” language we hear so much about, and instead flip it around and consider knowledge translation in the opposite direction: how information from the social side informs the research sphere.
So we need to completely change what we're looking for, and focus on evidence of how the needs of society or industry shaped the research endeavor. It's more than "I had a question, I know how something happened and I set about to understand it and hopefully it'll be applicable". Instead, it needs to be "are the questions that I'm asking directly inspired by, or shaped by, a need outside the research community?"
The patron saint here is Louis Pasteur. Abraham Flexner, the guy who started the Institute for Advanced Study at Princeton, doing the purest exploratory research, had all the time in the world for Pasteur, even though he wrote an article on "The Usefulness of Useless Knowledge". He talked about Pasteur taking inspiration from the real needs of people and converting those unknowns into fundamental challenges.
We need to take needs and convert those into real problems. Transform the applied into the fundamental rather than take the fundamental and try to apply it. So we need to draw out that process. I think that many researchers who do applied research do this in an intuitive sense. They are not cloistered, just coming up with stuff in which they're interested. Even for curiosity-driven research, that curiosity comes from somewhere. The challenge for us is to shine a light on the process transforming an applied problem into a research challenge that can give us a solution. It's difficult to answer as far as specific metrics go, but I think what we need to start with is to clearly articulate what we are asking people to show evidence of, and engage in that conversation first.
KB: Building on that reframing of “purposeful research”, how do we go about identifying the problems that should inform the research we support? If there’s an existing company with a problem, it’s easy. What about broader societal goals that may not have an existing industry champion?
SL: Yes, the obvious case is where you've got a technical challenge. You know, if you're Louis Pasteur, it's fermentation in wine: it's an economic disaster if they can't sell their wine, and so that lends itself to an obvious set of challenges. The genius there is to broaden that into a wider set of applicable ideas.
What we need to be able to do here, and in fact what we’re required to do in order to give due response to the instructions we’ve been provided by our minister, is to begin with the three sectors that are the areas of focus within this economic model, and extract from that the research challenge components, because not everything should be treated as a research problem. Research is the right instrument to use when there is something we need to learn to better understand what to do. Under those circumstances, the application of scientific methods will get us closer to better, more informed choices. But there are many challenges where research is not the limiting factor. What’s preventing the improvement is something else: making the trade-off between competing benefits; devoting sufficient time or financial resources to something; or management or administrative competence. In those cases, calling for research is a diversion. There may be good things that come from that research, but it won’t be to improve the immediate problem at hand.
In an economic environment where the economy grows by industry participation, you could also look at it through the lens that saving public investment means that you are reducing demand on public resources. But typically those savings are simply drawn into other areas of public investment, so it doesn't necessarily produce cost savings. But if you accept the premise that you need industry to make use of what is produced, then the question of what should be a public investment versus what is a private investment opportunity becomes important. I think the appropriate area for public investment is where you have broad need, across a number of participants, where there is insufficient incentive for any one individual organization to pursue that particular work. You cannot expect an industry participant to take on that particular cost, because the spillover benefits will go to others. That makes a very strong argument for why the public sector ought to be involved at this stage, because it is going to provide broad sustained support to industry in that particular sector.
KB: Under the new strategic plan, does Research Nova Scotia take an active position on governance of the intellectual property coming out of the research you fund?
SL: One of the outcomes that we have been asked to produce, one of the seven that the minister’s given us, is to enhance commercialization of research, including retention and deployment of Nova Scotia IP in the province. So in the minister’s mind, the province is already thinking about this connection between who owns the IP and how it’s deployed in service of economic growth in the province as opposed to kind of the simpler question of what’s going to create jobs or whatever. So that was already part of the broad context in which Research Nova Scotia operates.
So as a result, we evaluated whether it made sense for us as an organization to take some sort of equity position or ownership of IP as a result of the research in which we were going to invest. We had a hard look, a really hard look, at a UK program which ran for about four or five years, where the model was to invest public money in R&D programs executed by industry, but the public funder retained the right to take ownership of any resulting IP if it wasn't used within a particular timeframe. So if IP is created and nothing's done with it in that timeframe, then the public owns that IP. They went through three or four funding rounds and they were in an experimental and adaptive mode, as we were discussing a few minutes ago. The reason we didn't go down that path was that, in discussion with the developers of that program, who kept changing it and trying to get closer and closer to delivery of a better program, I realized that they had never actually exercised that option because they didn't know what to do with it. Just because they owned the IP wouldn't necessarily result in improved deployment of that IP in service of the ultimate outcome. Holding the patent is nothing; you have to actually build it into something that then creates value.
I arrived at the conclusion that we need to stick with what we're good at, and instead turn IP management into an area that an applicant for our research funding has to think through and shape and articulate to us. I think that this speaks to a difference between prescribing a set of rules that everyone has to work within, and developing an enabling framework with some flexibility that acknowledges that we might not know what the IP environment has to look like, but makes clear to applicants that part of our job is to ensure that there's increased deployment of IP in Nova Scotia for its economic benefit. So we ask applicants to demonstrate to us how their approach is going to meet that outcome. Rather than us claiming IP or trying to manage it, applicants know they will need to develop a results distribution plan, and that plan may be to share it broadly, it may be to hold it close, but it needs to be the right approach in context to deliver on the target outcome. It shifts the burden over to the applicant, but it allows flexibility depending on what area they are working in. It's going to be very different for deep tech versus process improvement. What IP actually means is going to be very different, and what's patentable, and so on.
I have every interest in ensuring that the research we support creates commercializable value that is then picked up either by the performer or by whomever ownership passes to in order to mobilize it in service of society. But I’m very aware of my limitations and I do not assume I know how to do that better than them.
We need to be clear about what actually creates economic value. Is it the invention itself -- the output of R&D? Or is it the application of that invention to industrial activity, or cost-saving public service delivery? In my view, it is the latter. There is some economic value to R&D activity -- researchers are paid -- but there is typically greater opportunity to build economic activity adjacent to the R&D efforts. This is not an automatic process. Some clear-eyed assessment needs to be done about what core ingredients determine where R&D-driven economic activity will actually occur. If, in Nova Scotia, we have few to none of those ingredients (e.g. labour profile; domestic market; proximity to export; low carbon energy, etc.), then being the R&D centre is unlikely to result in the economic activity that comes with the application. I like to think about where we have a competitive advantage on applications (e.g. manufacturing or other forms of R&D deployment) that already exist or could be built, and focus R&D efforts there. These can be small niches (that's inevitable with a population of 1M), but that's where the opportunities lie. I like the concept of 'conversion': converting R&D results to application. You need to actually pay attention to not just your research strengths, but also the path through to converting that into value, in whatever form it may take.
KB: A central theme of your approach to all of this is “intention”, understanding and being tolerant to the fact that intention will not always translate to reality. Having researchers articulate that intention leaves lots of flexibility, but shifts the burden onto you in that your reporting systems have to reflect that flexibility. How are you thinking about making that open-ended reporting practical given the variety of approaches that you’re going to encounter?
SL: I should start by saying that like every publicly funded or taxpayer funded organization, there is an expectation from the people who fund us that we will be able to report convincingly that we are managing those funds responsibly and to further their aims. Nothing of what I’m saying is trying to shirk that sense of responsibility. But what I report has to be real. I don’t want to go to them and say that we can come up with a set of metrics that will allow you to track the dollar to the benefit when that is not a realistic expectation, and moreover I don’t think that’s really what they think is feasible either.
What’s really important is that, in everything we do, we understand what they’re trying to achieve, we take it seriously, and we adjust our processes and the people with whom we work and the type of research that we’re supporting, and we follow up wherever and however and in whatever creative ways we can to ensure that we’re able to report on outcomes and to take corrective measures. That is always going to involve a combination of two approaches. The first is a formalized approach in which we require certain reporting or provision of data that we can then assemble into a report. The easiest part of that is based on having a financial relationship with someone because the research process is underway, where they provide us with information and we give them funding in return -- we can compel them. That’s easy.
The second part, and the only way this is going to actually work in the long term, though, is to maintain that reporting relationship over the much, much longer term, longer than we’ve yet been on this earth as an organization. How that’s going to happen precisely is difficult to prescribe at this point in time except that we have been doing this in a slightly different form for the first six years, and I can tell you that the way it actually happens is by cultivating a productive and trusting relationship with the researchers that you support over the long term, through which we build confidence that our role is not to provide oversight and pass judgment on their quality, but that our role is to enable their success.
“Success” here does not mean that I particularly care whether they’re going to get tenure or promotion, or publish in Nature or Cell, or whatever -- those are the accoutrements of success. By success, I mean that the research that they are passionate about, and really want to see happen, is able to proceed through the system to its logical conclusion. That doesn’t mean it necessarily always works, and so cultivating that personal relationship means that they’re comfortable that when a challenge occurs, when a supplier doesn’t have something, when something’s going to cost more than planned, when students who they thought were going to come can’t because the federal system changes and they can’t bring in a foreign student who was going to be the postdoc working on the project, or whatever the issue is, we want to be their first call. We want to be able to ask them how we can help overcome the issue. I think that’s how you build an understanding over the longer term of how productive work happens. I think mapping the journey is almost as important as how many people they hired, whether they filed a patent, and where that patent went. It’s going to be part of it but it can’t be all of it.
So yes, it shifts the burden to a more qualitative or narrative model, but I think ultimately that becomes more meaningful in terms of actually giving insight to the people who have given us the funds to further this kind of work as part of the many things they could spend their money on.
KB: You are leading the charge on a priority shift that I expect is going to come to the rest of the provinces, and probably the federal research agencies as well, in the near future. Having done this, do you have advice for your counterparts in other provinces that are asked to similarly reorient their research funding toward economic impact?
SL: You’re right that this is happening everywhere. The conversations we have with other provinces, and at the federal level -- the Capstone Agency, mission-driven research, dual-use tech, and so on -- are all part of the conversation here, but in Australia, the UK, the US, New Zealand, etc., everyone’s thinking about how granting agencies operate, and I think it’s important to consider that the research ecosystem is very complex.
The shift is not just toward an economic focus. It is also in recognizing that there’s room for, and a really powerful role for, active coordination of research rather than just simply passively running a competition to see who applies, picking the best based on some research excellence metrics, and hoping for the best. I think that’s part of how research happens, but it’s not the full thing. You need to round it out with a recognition that certain societal challenges need a different model. All the countries I just listed, and many of the different provinces, are faced with this challenge as well. I think maybe we had to confront it a little sooner in a real way than others, and so I think, to revisit maybe some earlier thoughts and give emphasis, this idea of experimentation and adaptation is crucial.
The dominant model that’s been used to identify research -- say the peer-review system and all of that -- has been relatively stable, but it continues to undergo experimentation as well, just in more subtle ways. We should expect, as we move toward a direct connection between economic activity and research outputs and a more coordinated model, to have a higher degree of adaptation or experimentation early on.
The second observation is that research funders and coordinators like us—which is to say, relatively small ones—are pulling macro-level levers. We are designing research funding programs, we’re engaging with universities as almost conceptual organizations, as groups of scholars, but this is going to work or not work by marrying those macro-level concerns with the micro reality. The micro reality involves, in academic settings, things like tenure and promotion systems, university senates, and the various decision-making structures within universities. In industry there’s the reality of supply chain, the cost of borrowing, and the competitive environment that’s changing. People are managing micro-forces and we’re pulling macro-levers. I think that understanding the dynamic of how those two aspects work together is going to be a determining factor in whether programs that we design and engage with are going to be successful or frustrating, or maybe how much they have to change.
I think that this has to be more than just the simple discussion around leveraging funds. A lot of the conversation revolves around the fact that when there is a new federal program, what’s going to be the percent of federal investment versus provincial investment. I think a more nuanced view of it is to consider what the federal role is, versus the provincial role. The outcome of a proper partnership-based discussion will be a funding model. But what matters when building R&D funding models that have both federal and provincial components is true reflection of what the different jurisdictions’ needs and capacities are. When I say ‘needs’, I mean that within a purposeful research program, those purposes are legitimately different. That is a good thing: let’s draw out where those purposes align, and where they don’t. The idea of ‘capacity’ is bigger than just financial: it includes things like decision-making structures and timeframes. This may express itself in a different matching formula when it comes to money but a partnership model actually looks at what each of the partners wishes to achieve and then finds something that in combination strengthens the position of the component parts.
I think that identifying and managing the spillover benefits and where they go really matters. This is bringing back the point that, at the provincial level, when we are directed explicitly to increase provincial economic growth and provincial economic productivity we really do need to pay attention to where that conversion into value is going to happen, and where R&D mastery converts to economic opportunity. We have to care about that. That’s not because we’re Philistines and don’t care about other issues in the world, it’s that we’re dealing with provincial dollars from provincial taxpayers. It’s the same person that’s a federal taxpayer, but in that sense they are contributing to a different set of interests than at the federal level.
The final thing, because these are issues we have really grappled with over the past eight months and I can’t imagine anyone doing this kind of work failing to grapple with them, is twofold. First, coordination is really difficult. Coordination amongst funds that have different decision-making structures, timeframes, and mandates just takes time. Second, a lot of where research value can exist is in potential savings of public investment, say in healthcare. It’s really important to look at what those kinds of innovations can do to simply reduce demand in other areas. If you’re really focused on economic growth you actually have to look at what is going to grow the economy, not necessarily displace one expenditure for another. On the surface it seems somewhat simple to say “we are now devoted towards economic growth”. Understanding what economic growth actually looks like does matter, even when considering what kind of research projects are best suited to achieving that kind of outcome.
KB: Is there anything I should have asked you but didn’t?
SL: I always ask people if they are optimistic. I’m personally optimistic even if I can be somewhat globally pessimistic. I think it’s a very challenging time, but I think that constraint is often a precursor to creativity. So this is, I think, the source of my optimism.
People ascribe this quote to either Ernest Rutherford or Winston Churchill but it goes something along the lines of “We have not got any money, so we have got to think”.
I’d love to have lots of research dollars to dispense. There’s lots of good work, there’s endless amounts of work and endless numbers of people who are doing great stuff, so I’m not saying that we need to be restricted in that regard but nevertheless it is to a certain extent those restrictions, which are not just financial but directional, and the reality of how the system works, that do offer us the opportunity to be creative and try something new.
I think there’s a reason this is happening, or can happen, here in Nova Scotia: we are in a somewhat unique position. We don’t have the complexity that comes with the size of some of the other provinces. We do not, and never will, have the budget of Ontario, Quebec, BC, or even Alberta. But we do have a substantial research history and research capacity that we can mobilize. We’re at that mesocosm level of having some of the elements that enable really high-quality, functional research to occur, but with a population of about 1 million people, so we can connect on an almost personal, direct basis with all the people that we’re working with. This is what gives me optimism: we can be, should be, and should be seen by others to be worth watching as an experiment, to see what can be drawn into other jurisdictions as well. I’m an optimist in that regard.
Key Takeaways
Research with purpose
In the debate around Canada reorienting its research infrastructure toward the needs of society, I often come across the perception that funding research with the goal of socioeconomic impact is somehow incompatible with research conducted for pursuit of knowledge. As a result, the various words we use to describe the distinction between different research goals have picked up some baggage. Various commentators, including myself, tend to use simplistic labels that divide research into one of two categories variously called “fundamental”, “basic”, or “curiosity driven” and “applied” or “demand-driven”.
Stefan argues that this is too simplistic a categorization to be useful. Instead, he adopts a more nuanced framing around the idea of purposeful or intentional research.
“We looked at it from the perspective of identifying the “pull” and then deriving the research need from that, regardless of whether it is applied or fundamental by that set of definitions. […] For RNS, the discovery or the knowledge that’s gained through that research process has to be applicable beyond the immediate circumstances by people who weren’t involved in the initial investigation.”
In other words, it does not matter whether the research is fundamental or applied, what matters is that it responds to a clearly identified societal need. If the barrier to addressing that need is basic research, then that is what should be funded.
The challenge, then, is two-fold: identifying the need, and articulating clearly what kind of research needs to take place to address it.
If we consider the various existing research funding programs that focus on research commercialization (NSERC Alliance, I2I, and others), we find that they typically require an existing industry partner to co-fund the research, based on the idea that this signals clear private sector demand for the research. This is “demand-driven” research, which, while a subset of purposeful research, misses opportunities to address societal needs that do not yet have an industry champion. The RNS approach is more inclusive:
“You need not focus on evidence or expectation that whatever comes out of the lab is going to be applied, but instead look for indicators of how the problem within society that needs to be solved is influencing what goes on in the lab […] we need to completely change what we’re looking for, and focus on evidence of how the needs of society or industry shaped the research endeavor.”
In other words, funding agencies should look for clear evidence that their applicants are proposing research that is responsive to a need that they can clearly identify, and that the need has informed their research plan.
Acting on this involves a degree of flexibility on the part of the grant funders: while it is very easy to use the existence of an industry partner as evidence of societal need, we will have to do better if we are to build research funding systems that are more broadly responsive. Not all worthwhile problems have a company trying to solve them already. This has been a significant challenge for any public sector granting agency that seeks to fund research broadly, since it is impossible for public servants to be technical experts in all the areas of research they will encounter in grant applications. Stefan freely acknowledges the challenge:
“I have every interest in ensuring that the research we support creates commercializable value that is then picked up either by the performer or by whomever ownership passes to in order to mobilize it in service of society. But I’m very aware of my limitations and I do not assume I know how to do that better than [the researchers].”
Stefan’s reframing of what constitutes purposeful research provides a clear blueprint on which a more responsive research funding system should be based. The applicant is the technical expert. Instead of attempting to evaluate a proposal primarily on its technical merit while using industry demand as a proxy for societal need, our research agencies should look for evidence that a proposal responds to a clearly identified need, that the research plan is informed by a nuanced understanding of that need, and that the applicant can articulate how the proposed research seeks to overcome the immediate challenge to addressing it.
Measuring success
Where research is concerned, intention will not always (or more accurately, will usually not) translate to the desired outcome. RNS has been handed a significant challenge in being asked to connect the research they fund to economic impact, since there are an enormous number of steps between the lab and societal impact that have nothing to do with the quality of the idea or the expertise of the researchers involved.
“You can’t hold a researcher to account for a failure to realize economic return because there are so many aspects that are going to either frustrate or amplify creation of that economic value over which they have no control, but you can certainly expect that there is effort and intention behind what they do.”
The path from lab to market is often convoluted, and is only ever clear with the benefit of hindsight. This makes tracking outcomes extremely difficult, and leads to the typical proxy metrics that are used in most Canadian research programs: number of patents filed, number of jobs created, etc. In my view, these should be considered input metrics rather than indications of output. Impact must be tracked on a much longer timeframe, one that is sensitive to what is reasonable in the context of different types of research. Stefan’s commentary supports this view:
“There was a study done not too long ago that looked at the timeframe between publication in a scientific journal, where the research has been “concluded”, and the application for a patent. Not the use of that patent, mind you, just applying it. That’s on the order of three to six years. So, if you’re looking for evidence of economic benefit from the activity of research, you have to think about the timeframe of years. The more fundamental the research question, the longer that timeframe is. For basic research, we’re talking probably 20 years.”
Stefan notes that even when research eventually does achieve impact, the impact may not be felt in the same place the research was conducted.
“This is the challenge around R&D: you have a high degree of uncertainty in where those benefits are felt, but you have high certainty over where those costs originate. So you have what you could probably characterize as benefits that are diffuse; distant; hard to attribute to an individual supporter, and that are often disconnected from the original investor or research performer, but the costs have all the opposite characteristics.”
To respond to its mandate of provincial economic growth, RNS has to be clear-eyed about what it means to succeed, and it must be able to track the outcomes of its projects over the long term. Intention is just as important to outcome tracking as it is to identifying research worth funding:
“[…] you need something that goes well beyond the timeframe of the research project itself […] and you need to focus as much on efforts and intent as on results.”
Given the variety of research projects being funded and the diversity of ways in which research can respond to societal needs, there is no one-size-fits-all approach to outcome tracking that will be effective. Stefan makes clear that rather than attempting to rely on a simplistic set of proxy metrics, the RNS approach will be based on relationships:
“By success, I mean that the research that they are passionate about, and really want to see happen, is able to proceed through the system to its logical conclusion. That doesn't mean it necessarily always works, and so cultivating that personal relationship means that they're comfortable that when a challenge occurs […] we want to be their first call. We want to be able to ask them how we can help overcome the issue.”
Finally, iteration is key. Given the complexity, and the high probability that not all intention will translate to outcomes, the value of a tracking system lies less in reporting on impact and justifying the money spent than in making sure that lessons are learned and incorporated into future funding calls.
“I think that the metrics you need to look at need to allow you to develop a sensitive understanding about why something worked, or why it didn’t work, so that you can apply those learnings to future investigations”
Outlook
Public funding for research has the potential to be enormously valuable to Canada. Just as not all research has an existing industry champion, not all societal needs can or will be addressed through private initiatives. In Stefan’s words:
“I think the appropriate area for public investment is where you have broad need, across a number of participants, where there is insufficient incentive for any one individual organization to pursue that particular work”
The commentary to this point makes clear that the debate should not be over what balance we should strike between funding fundamental and applied research, but rather over how to ensure that research is performed with intention and purpose. The words we choose to discuss policy challenges matter. I will be moving away from the idea of “demand-driven” research in my own writing, focusing instead on the more inclusive idea of purpose-driven, intentional research.
Stefan’s closing remarks are particularly impactful. We are in a time of geopolitical upheaval, and it is more important than ever before that Canada learn how to translate its research excellence into real-world impact. Stefan comments that “constraint is often a precursor to creativity”, and it certainly feels to me that our public sector is waking up to the fact that doing more of what we have been doing to date will not work, and that creativity is needed. This need for creativity in the face of constraint is the basis of my certainty that the change enacted by RNS will come to the rest of Canada in some form or another.
The strategic plan developed by RNS should be a central point of reference for all Canadian research funding programs that seek to achieve real-world impact from the research they fund. It reflects precisely what they ask of their researchers: a clear connection between the actions they are undertaking, and the needs of the society that they serve.