Better bibliometrics needed for greater accountability of research expenditures

By Éric Archambault and Grégoire Côté
Guest Contributors | July 28, 2008

Canada spends about $25 billion on R&D every year, of which $5 billion are public funds spent by the federal government alone. Compared with our relative generosity in spending these sums, we are remarkably stingy when it comes to spending time and money to acquire hard data on the outputs and immediate impacts of science. The most cost-effective indicators of these outputs and impacts are scientometric ("measurement of science") and technometric ("measurement of technology") assessments based on bibliometric methods, that is, counts of bibliographic records such as papers or patents. Yet these are seldom seen in the Canadian science policy and evaluation landscape.

The primary goal of research is to produce knowledge, and the main vehicle for disseminating new knowledge in the natural sciences and in many of the social sciences is the peer-reviewed paper. Research has maximum impact when the new knowledge it produces is widely adopted, in other words, when papers are widely cited. Both the number of papers and the citations they receive can be measured objectively through bibliometric methods.
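
To make this concrete, here is a minimal sketch, in Python, of the two basic bibliometric measures, output (paper counts) and impact (citations per paper), computed from bibliographic records. The record structure and the numbers are hypothetical illustrations; real analyses draw on curated databases such as Web of Science or Scopus.

    # Minimal sketch: output and impact computed from bibliographic records.
    # The records below are hypothetical illustrations.
    records = [
        {"author": "Smith", "year": 2006, "citations": 12},
        {"author": "Smith", "year": 2007, "citations": 4},
        {"author": "Tremblay", "year": 2007, "citations": 30},
    ]

    papers = {}     # number of papers per author
    citations = {}  # total citations per author
    for rec in records:
        papers[rec["author"]] = papers.get(rec["author"], 0) + 1
        citations[rec["author"]] = citations.get(rec["author"], 0) + rec["citations"]

    for author, n in papers.items():
        print(f"{author}: {n} papers, {citations[author] / n:.1f} citations per paper")

The counting itself is mechanical; the craft of bibliometrics lies in deciding what is counted, over which time window, and against which baseline.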

Although these methods are the most direct and cost-effective way to measure scientific outputs and impacts, they are not widely used in Canada to evaluate scientific activities or programs, or to allocate resources. This may change, as the UK is about to play a leading role by bringing bibliometric methods to the fore in research fund allocation in its next Research Assessment Exercise (RAE). An increasing number of actors in Canada are paying lip service to bibliometrics, for example by proposing to include systematic bibliometric data in their performance reporting plans or to use it as a research evaluation tool. But despite slowly growing demand, obstacles still stand in the way of widespread adoption of these methods.

Bibliometrics has been widely used in the field of scientometrics for almost 50 years. Scientometrics, which rapidly earned the title of "the science of science," is now a mature field in its own right. International conferences are held regularly, at which scientists present replicable, rigorous experiments, models and findings that build on, or push forward, developments in scientometric and bibliometric methods.

Despite these developments, academics and scientists regularly cast doubt on scientometric methods. Their criticisms, however, are usually based on hearsay or lack empirical support. In our opinion, this reflects the traditional model of science as autonomous and self-regulating, in contrast to the more recent view that science, like any other costly activity for society, has to be held accountable. The old model relies on peer review and requires an act of faith on the part of society: we are asked to believe that, as long as funds are made available, science will, through self-regulation, contribute to social and economic progress.

Peer review and subliminal bibliometrics

Although one often hears scientists say that only peer review can guarantee an effective evaluation of grant applications, grant adjudication is often accompanied by an artful form of subliminal bibliometrics. It is not uncommon to hear statements such as "she published many more papers than he did" or "he has published three papers in journal X, which has a large impact factor." The problem with this sneaky use of bibliometrics is that it lacks reproducibility and rigour.
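
For reference, the "impact factor" invoked in such statements is typically the standard two-year journal measure:

    impact factor of journal J in year Y =
        (citations received in year Y by items J published in years Y-1 and Y-2)
        ÷ (citable items J published in years Y-1 and Y-2)

It describes the average recent citedness of a journal, not the quality of any individual paper in it, which is one more reason its casual use in adjudication lacks rigour.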

This is precisely the aspect the UK RAE wants to tackle, by promoting a selection process based on independent, rigorous, performance-based and transparent criteria. In this sense, bibliometric methods could help the peer-review process concentrate on what it does best: evaluating the research project for which funds are being sought.

Pork-barrel science policy

Many decisions in science and technology policy rely on gut feeling and intuition more than on hard evidence about the dynamics of science and about local strengths and weaknesses. Moreover, decisions about precisely where activities will be conducted and where investments in research facilities will be made are often based on power plays or urban legends.

It is not hard to find deep misconceptions about research excellence and intensity. For example, some argue that Canada is a nanotechnology paradise whereas hard evidence shows that we have basically missed the boat in this field — our research intensity ranks 49th out of the 50 leading countries. Then there's the case of optimistic promoters in Quebec City back in 2002, who claimed that the city was a world leader in biophotonics, given local strengths in health science and in photonics. By contrast, quantitative data show that Quebec City's biophotonics efforts were only a minute dot on the scientific map.

Evidence-based evaluation & decision making

Research councils could make greater use of bibliometric methods to show whether their funding is helping researchers achieve the expected level of scientific excellence. They could also see whether the allocation of funds follows the evolution of scientific fields and whether the country is keeping pace globally. Science-based departments could see whether their investments are aligned with global trends and fill knowledge gaps. Bibliometrics also provides evidence of whether new knowledge is being taken up significantly by researchers in other communities and whether government scientists collaborate with industry and universities in producing knowledge, as the sketch below illustrates for cross-field comparisons.
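
One way such comparisons are made rigorous is field normalization: each paper's citation count is divided by the world average for papers of the same field and publication year, and these ratios are averaged. A score above 1.0 means a group is cited more than the world average in its own fields. The baselines and papers below are hypothetical; in practice they are computed from the full bibliographic database.

    # Minimal sketch of field-normalized citation impact
    # (an average of relative citations). All numbers are hypothetical.
    world_average = {("Physics", 2006): 8.0, ("Biology", 2006): 15.0}

    papers = [
        {"field": "Physics", "year": 2006, "citations": 12},
        {"field": "Biology", "year": 2006, "citations": 10},
    ]

    ratios = [p["citations"] / world_average[(p["field"], p["year"])] for p in papers]
    print(f"Average of relative citations: {sum(ratios) / len(ratios):.2f}")

Here the result is 1.08, slightly above the world average, and the physics paper is not penalized for belonging to a field that simply cites less than biology does.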

Despite its apparent simplicity, bibliometrics is a complex science. It is therefore important to avoid amateur and off-the-shelf uses of bibliometric data, which often lead to errors and sustain misconceptions. There is currently too little knowledge of this field in Canada. Training and educating practitioners, users, evaluators and managers is crucial, as bibliometric methods are neither simple nor meant to be used in isolation. Scientometrics and technometrics are at least as potent at raising questions as at answering them. They should be part of a practice-driven research evaluation continuum.

Éric Archambault is President of Montreal-based Science-Metrix and Grégoire Côté is a bibliometric associate at the firm.

