Measuring the long-term impact of innovation policy
A close look at the ineffectiveness of the KPIs of incumbent Canadian innovation support systems, and recommendations for quantitative metrics that align incentives toward long-term economic benefit
One of the main challenges with building effective innovation policy in Canada is a complete lack of data with which to make evidence-based decisions. Last week I wrote in detail about reforms to the SR&ED program and implementation of a patent box regime as a means to support Canadian innovation. A major theme of my thoughts on both of those issues revolves around addressing this challenge, and what actually constitutes useful metrics for tracking the long-term impact of innovation policy.
In the course of accessing support for the early stages of building my company, I was consistently surprised at how little information is collected across all Canadian innovation support structures. In cases where information was collected, I was disappointed to see that it lacked the detail that would be required to make it actionable, and that the timelines involved in data collection were usually misaligned with the timelines to expected impact. In short, most of the innovation-related data collection that happens either prioritizes collection of the wrong metrics, or considers them over the wrong timeframe, or both.
In particular, with 90% of the value of the S&P 500 now estimated to be based on intangibles such as patents and data, the lack of long-term tracking of intangible assets arising from publicly subsidized research means that Canadian innovation support is missing most of the value that it could be creating. The result is clear, with Canada having the dubious distinction of joining a small handful of developed countries that “have seen the assessed quality of their overall entrepreneurial environment slip into the ‘less than sufficient’ category”, according to a recent report by the Global Entrepreneurship Monitor.
In this article, I give a brief overview of the failings of the current data collection systems, and suggest a new set of metrics that should be included in the administration and tracking of Canadian innovation support programs.
Current data collection frameworks
In the course of applying for, and reporting on the results of, most early-stage innovation supports as a private company (IRAP and CanExport in particular), the main criteria of concern were the number of full-time jobs that would be created, and the projected revenues received, within a 12-18 month timeframe. There are two core problems with this.
First, in the early stages of deep tech development, there will probably be no revenues within 18 months. The early stages of building are focused on creation of value in the form of intellectual property. If the time horizon over which impact is assessed is limited in this way, any investment made in deep tech will look like a write-off, but only because the long-term (5-10-year) impact of the value created was not captured.
Second, short-term job creation should not be conflated with innovation and long-term economic benefit. Job creation is a side effect of a healthy innovation ecosystem; it should not be the end goal. Use of job creation as a proxy for effective innovation policy leads us to subsidize building battery factories instead of investing in the technology itself, trading enormously valuable intellectual property developed in Canada for a mostly foreign-owned factory that is unlikely to ever see a return on investment.
Mitacs is an extremely valuable early-stage innovation support that is one of the few Canadian funding systems that operates in the valley of death, but the metrics that they collect provide them no insight at all into the long-term impact of having provided that funding. Their metrics focus on the value of training received in the course of the grant, with only cursory consideration of intellectual property created.
Many incubators require annual reports on member company activity, but these offer similarly little in terms of long-term tracking potential. In addition to very low compliance rates in reporting due to a lack of consequence for failure to do so, metrics collected are focused on job creation and revenues, with little to no consideration for intellectual property beyond reporting the number of new patents filed. Without considering the quality, ownership, and licensing status of those patents over their lifetime, this information is neither actionable nor useful.
To date, I am unaware of any innovation support program that considers data in its definition of intellectual property, in spite of the fact that the recent AI boom has elevated the value of high-quality, curated training data by orders of magnitude.
What all of the metrics that are currently collected have in common is that they are intermediate, short-term proxies for the actual desired long-term impact. Job creation, last year’s revenues, and the raw number of patents filed are all easily quantifiable numbers that appear at first glance to be useful metrics, but in reality they provide very little forward guidance.
When intermediate metrics are used as a proxy for the target outcome, they become the target, with programs optimizing for the metric instead of the outcome. As soon as a proxy metric is used to allocate funding or make short-term decisions, it will be manipulated and will cease to be a useful means to make further decisions (see for example the h-index in academia, which now incentivizes academic fraud instead of measuring career impact as intended).
Tracking Intellectual Property
Intellectual property is where data collection needs to focus in order to assess the long-term value and impact of Canadian innovation support spending.
An enormous number of patents are filed by Canadian institutions each year, but in spite of my best efforts I am unable to find high-quality data that records where those patents end up. Public-facing resources typically list only the owner/assignee and are woefully out of date. Information about licensing is severely limited, except to note that patents are more than twice as likely to be licensed to a US-based firm than to a domestic startup company. As of the time of that survey, now more than 10 years out of date, the proportion of patents that were licensed out to the US was trending upward.
Without a clear picture of the flow of IP, from ideation all the way through to commercial impact, it is not possible to assess the impact of innovation support funding.
To address this, receipt of Canadian funding to support innovation should come with some strings attached, both at the academic and commercial stages of IP development. These strings should focus not on restricting the flow of IP, but rather on ensuring that it is understood and documented.
Academic IP is particularly difficult to track past its origins, since in most cases the university retains ownership and private, NDA-protected licenses are signed as part of the research funding agreement that gave rise to it. Because only the owner (usually the university) is a matter of public record, it is very difficult to determine the licensee from public data, and all but impossible to track sub-licensees. Depending on its details, a license can be effectively equivalent to ownership, yet it will never make it into the public record.
Licensing or assignment of IP from universities resulting from research conducted with public funding should contractually require reporting on that IP by the licensee or assignee, directly to an entity within the federal government, for a period of 10 years following receipt of the funding. Moreover, this requirement should be viral, in the sense that it should be contractually inherited by any acquirer, (sub)licensee, or assignee of that IP. This should be a required element in any license for IP generated using public funding signed by a university tech transfer office, which is to say that universities should not be empowered to issue licenses that do not include this element.
The same requirement should apply to companies that receive public subsidies for R&D, for example through SR&ED and IRAP. A condition of receiving funding should be long-term reporting on the impact of that funding, constructed so as to ensure that this requirement is inherited by every entity that is ever granted access to that IP. By attaching viral reporting requirements to the IP at the source, we ensure that either we understand exactly how and where the resulting IP is used, or we recover the investment made in producing it.
It is important that the reporting requirement be to the federal government and not to the licensing university, since even in cases of contractual breach, university tech transfer offices do not have the funding to enforce these terms and are rarely empowered to do so. This is a perfect addition to the mandate of the Canadian Innovation Corporation, should it ever exist.
Metrics to be reported by all entities subject to the requirement should include:
All commercial entities that currently have access to the IP, including any new licenses issued or assignments executed
Gross revenues arising from products or services that incorporate or use the IP
Gross revenues arising from any licenses issued on that IP
Obviously this is private business information and should be protected and not publicized except in anonymized and aggregated form.
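To make the proposed reporting concrete, the metrics above could be captured in a simple structured record filed annually with the federal registry. This is only an illustrative sketch: the registry, the class names, and every field name here are hypothetical, not part of any existing program.

```python
from dataclasses import dataclass, field

# Hypothetical schema for an annual IP-impact report.
# All names and fields are illustrative assumptions, not an existing standard.

@dataclass
class AccessEvent:
    """A new license, sublicense, or assignment granting access to the IP."""
    entity: str       # commercial entity granted access
    event_type: str   # "license", "sublicense", or "assignment"
    year: int

@dataclass
class IPImpactReport:
    ip_identifier: str      # e.g. a patent number or dataset identifier
    reporting_entity: str   # who is filing this report
    funding_program: str    # originating subsidy, e.g. "IRAP" or "SR&ED"
    access_events: list[AccessEvent] = field(default_factory=list)
    product_revenue_cad: float = 0.0    # gross revenue from products/services using the IP
    licensing_revenue_cad: float = 0.0  # gross revenue from licenses issued on the IP

# Example filing: a sublicense to a foreign firm becomes visible in the record,
# which is exactly the flow that is invisible in today's public data.
report = IPImpactReport(
    ip_identifier="CA-1234567",
    reporting_entity="Example Deep Tech Inc.",
    funding_program="IRAP",
)
report.access_events.append(AccessEvent("US Acquirer LLC", "sublicense", 2024))
report.product_revenue_cad = 250_000.0
```

Because every entity in the chain would file the same record shape, reports from licensor and licensee could be cross-checked against each other, which is what makes the audit trail described below possible.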
Failure to report, or failure to contractually pass on reporting requirements to licensees and assignees, should trigger repayment of some or all of the public funding that led to the creation of the IP, by whatever entity in the chain failed in their requirement. By maintaining reporting requirements across all entities in the chain, it also creates accountability as information can be cross-checked and audited where inconsistencies arise.
Most importantly, these metrics should be tracked for information purposes only, with no quotas or outcomes attached to them. The point is not to penalize or reward any specific performance, but rather to understand the long-term impact of Canadian-subsidized R&D. Only when we have a clear picture of the flow of intellectual property and data can we begin to make evidence-based policy decisions to support innovation in Canada. I suspect that we will find that most Canadian value creation is done on behalf of other countries. While this is not a comprehensive set of metrics that should be collected, it is to my mind by far the most important. Further inspiration can be drawn from the Global Entrepreneurship Monitor report.