Do business support programs have any impact? A comment on Rossi's rules

Guest Contributor
September 20, 2012

Commentary

By Dr Margaret Dalziel

In 1987, sociologist Peter Rossi distilled his experience evaluating social programs into a few simple 'metallic laws', where the durability of the metal indicates the law's robustness. The most robust of these is the iron law, which states: "The expected value of any net impact assessment of any large-scale social program is zero. The Iron Law arises from the experience that few impact assessments of large-scale social programs have found that the programs in question had any net impact. The law also means that, based on the evaluation efforts of the last 20 years, the best a priori estimate of the net impact assessment of any program is zero, i.e., that the program will have no effect."

The remaining laws are variations on the theme of the iron law. For example, the stainless steel law suggests that the better designed an evaluation, the more likely it is to find a net impact of zero. Rossi's laws are still in circulation because many econometricians have found confirming empirical results. And this is so regardless of whether the program is a social program designed to help people, or a business support program designed to help businesses. Many rigorous evaluations of such programs find marginal impacts, if any.

On the other hand, many programs can point to success stories that demonstrate impact and can attract clients, dedicated employees, and investors. So who's right? Are program stakeholders deluded or incredibly patient, hoping against the evidence that the program will someday have an impact? Or is it possible that many programs are having an impact that the econometricians are unable to capture?

No doubt everyone's right some of the time. But to better understand what's happening we need better data. Econometricians often use secondary data, and so benefit from large sample sizes and objective data. But secondary data, having been collected for other purposes, may not be entirely appropriate for the task at hand.

For example, many business support programs aim to serve new ventures, but new ventures will not show up in data that is limited to publicly traded firms. Success stories are just the opposite: they benefit from rich, pertinent data, but employ small and highly biased samples.

There are three additional reasons why econometricians may not find an impact, even when some consider a program worthy. First, econometricians consider average impacts. If a program serves 100 clients and only 30 benefit, should it be cancelled? The answer depends on the nature and extent of the benefits, the costs of the program, both overall costs and the costs to the 100 clients, and available alternatives.

Second, econometricians must select a measure of impact. Programs whose impacts depend on client needs will have impacts on multiple dimensions. Assessments that measure impacts on only one or two dimensions will miss impacts that weren't considered. For example, an assessment of a business support program that considers only impact on company revenues will miss impacts on innovativeness, emission reductions, and profitability.

Third, impact assessments occur at specific points in time. Some impacts may take a long time to materialize and so may be measurable only after the assessment has taken place.

The Evidence Network assesses the impact of business research, innovation, and support programs using primary data collected from client firms. This approach addresses the challenges faced by econometricians because the data is pertinent, fine-grained, multi-dimensional, and timely. We recently assessed the impact of a business support program on the revenues of 54 (61%) of the firms that had participated in the program. Only 23 of the 54 firms reported a positive impact on revenues. The mean impact is €0.38 million, the median impact is zero, and the total impact on revenues is €20.5 million. The total cost of the program was €1.2 million, for a benefit-cost ratio of 17.1, a figure that considers only one dimension of benefits (impact on revenues) against all costs. This program creates net value, even though an econometric analysis of the program would be unlikely to capture it. As Einstein said, "The problems that exist in the world today cannot be solved by the level of thinking that created them."
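The arithmetic behind these figures can be checked with a short sketch. The per-firm data is not published; only the reported aggregates are used, so this is a back-of-the-envelope consistency check, not the assessment itself:

```python
# Consistency check of the reported aggregates (a sketch, not the
# actual assessment data): 54 firms surveyed, total revenue impact
# of EUR 20.5 million, total program cost of EUR 1.2 million.
n_firms = 54
total_impact_m = 20.5   # EUR millions, reported total impact on revenues
program_cost_m = 1.2    # EUR millions, reported total program cost

mean_impact_m = total_impact_m / n_firms              # ~0.38, as reported
benefit_cost_ratio = total_impact_m / program_cost_m  # ~17.1, as reported

# With only 23 of 54 firms reporting any positive impact, more than half
# report zero, so the median impact is zero even though the mean is not.
firms_with_positive_impact = 23
median_is_zero = firms_with_positive_impact < n_firms / 2

print(round(mean_impact_m, 2), round(benefit_cost_ratio, 1), median_is_zero)
```

The gap between the zero median and the positive mean is exactly the pattern described above: a minority of clients capture most of the benefit, which an average-based or median-based econometric test can easily miss.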

Dr Margaret Dalziel is a professor of innovation and entrepreneurship at the Telfer School of Management, University of Ottawa, and VP Research, The Evidence Network. She is currently serving on the Council of Canadian Academies Expert Panel on the Socio-Economic Impacts of Innovation Investments.

