By Ron Freedman
You'd think that the data needed for S&T policy analysis would continually improve over time; that we'd have better data to inform decision making in the 2010s than we did in the 2000s, 1990s, etc. You'd be wrong. Some of this is due to short-sighted penny-pinching. Much is down to eroding interest and skills in S&T policy analysis in government and universities. And still more is due to the absence of a national plan for obtaining the needed data efficiently.
On the penny-pinching front, due to the 2012 budget cuts we lost StatCan's Survey of Intellectual Property Commercialization in the Higher Education Sector. No big deal, except that we've also lost the ability to track the commercialization outputs and outcomes of Canada's annual $11.5 billion of research spending in universities, hospitals and colleges. Whereas the benchmark StatCan report tracked the performance of 125 institutions and put the data into the public domain, its nearest counterpart, the AUTM study, tracks only 38 institutions and charges for the data. So, henceforth we'll be pretty much flying blind on the higher education commercialization front.
Likewise, last year we lost a key survey of Intellectual Property Management in Federal Science-based Departments, so there won't be any data on the commercialization outputs and outcomes of $2.5 billion of S&T spending in the federal realm.
In addition to tracking commercialization, the IP surveys also measured knowledge flows among sectors with a view to understanding outcomes. A large part of the value of data such as these is that over time they provide a moving picture of how the innovation system is performing, not merely an occasional snapshot. So even if a higher education IP survey were to be resurrected in (say) five years' time, we'll have lost multiple years of trend data.
Another loss in 2012 was StatCan's annual University and College Academic Staff System (UCASS) report. This little-known survey provided the only consistent national data on faculty numbers. The faculty counts are crucial for developing many research metrics (e.g. number of papers published per full-time faculty, number of invention disclosures per full-time faculty, etc.).
It's actually quite shocking to look at the statistical picture in aggregate. Innovation-related data are in good (or bad) company; StatCan reports that it currently has 350 active surveys and statistical programs, against 332 inactive ones. How long before there are more inactive than active?
None of this is to argue that cancelled surveys should be resurrected as they were. A number had become bloated with requirements for unneeded data and were due for major overhauls that would have saved money and time. Some had simply outlived their usefulness.
So it's apparent we're not doing so well on the "supply side" of innovation data. What of the "demand side"? After all, you could argue that if there's no demand for the data, why pay for the supply?
It is hard not to conclude that few people in the policy community, in government or universities, are interested in innovation data. The evidence is that, more than a year after the 2012 cuts, nobody in government has stepped forward to resurrect the "disappeared" innovation studies, although federal-provincial discussions are under way. Even university executives seem not to care what is happening in their own backyards.
Furthermore, the analytical community remains appallingly ignorant of key innovation data. Hard to believe but true. To this day nobody knows (cares?) how many medium-sized technology companies there are in Canada. Forget start-ups and early-stage firms; the true engines of economic growth are our medium-sized companies that have the potential to grow into tomorrow's multinationals. We don't even know how many of them there are, in which industry sectors, in which parts of the country, etc. And yet the country is throwing hundreds of millions of taxpayer dollars annually into innovation programs for SMEs ... without knowing even the most rudimentary facts about the population of companies.
Also consider the "three key pillars underpinning the STI ecosystem" that were highlighted in the Science, Technology and Innovation Council's latest State of the Nation report: "business innovation, knowledge development and transfer, and talent development and deployment". As things now stand, when it comes time for the 2014 State of the Nation report, STIC will have to scale back its analysis of knowledge development and transfer ... the data will be limited. The concern won't have disappeared, but the data will have eroded.
This is not a StatCan issue per se. Nearly all of its innovation research and statistics work is deemed peripheral to its core mandate (national accounts and census), which means that historically it has relied on other government funders and customers to pay the shot. A major problem is that StatCan's costs and charges are, to be charitable, "on the high side", and with current staffing levels it can't respond to data requests in a timely way. Something also needs to be done to break its monopoly on statistics. For instance, as in the US, much of its research could be contracted to the private sector and universities at far less cost. (In this scenario, StatCan's role would simply be quality control and distribution.) But this is a secondary issue for the moment.
A paradox in all this is that better coordination of data requests on the part of customers (primarily federal and provincial government departments) could yield enough savings to reinstate lost surveys and simultaneously commission important new work. Bluntly, data customers are wasting a great deal of taxpayers' money and tying up statistical resources by duplicating requests for the same data and not sharing the data they receive. If an annual plan were coordinated among the large public sector users, they could obtain the same or better data in a more timely way and at lower overall cost than they are currently paying.
It's hard to know what is more troubling: that the data are eroding, that the policy community seems to have lost interest in innovation data, or that plans to improve the situation are unfolding too slowly. Until federal and provincial officials come up with a national data plan that would preserve the important current data and develop a roadmap for future data, we'll still be flying blind.
Ron Freedman is a partner with The Impact Group and co-publisher of RE$EARCH MONEY.