The Center for International Forestry Research (CIFOR) and World Agroforestry (ICRAF) joined forces in 2019, leveraging a combined 65 years’ experience in research on the role of forests and trees in solving critical global challenges.
For scientists, numbers can mean everything. In agroforestry research for development, the number of farmers implementing new agroforestry technologies is important, as are the number of hectares under tree-based systems and the number of tons of product they produce or carbon dioxide they sequester. But it is the number of publications that appear in high-impact journals that can sometimes matter most when it comes to whether scientists can continue their research.
The World Agroforestry Centre, like many other centres in the CGIAR Consortium, encourages its scientists to publish in journals that have a high impact factor, and this is one element that is considered in assessing scientists’ performance. Journal impact factor is calculated annually for each scientific journal as the average number of times its recent articles, typically those from the preceding two years, have been cited in other articles.
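In its standard two-year form, the calculation is a simple ratio: citations received this year to a journal's articles from the previous two years, divided by the number of citable articles it published in those two years. A minimal sketch (the figures below are illustrative, not from any real journal):

```python
def impact_factor(citations_this_year, articles_prev_two_years):
    """Two-year journal impact factor: citations received this year
    to articles published in the previous two years, divided by the
    number of citable articles published in those two years."""
    return citations_this_year / articles_prev_two_years

# A hypothetical journal whose 200 articles from the previous two
# years drew 480 citations this year has an impact factor of 2.4.
print(impact_factor(480, 200))  # 2.4
```

Note that this is a property of the journal as a whole: every article published in it, however often cited individually, sits under the same number.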
“But,” says Mehmood Hassan, Head of Capacity Development at the Centre, “the impact factor of a journal surely only indicates the quality of the journal and not that of the scientist who publishes in it”.
Recently, an editorial appeared in Science, arguing that using impact factor as a metric for assessing scientists’ productivity is usually misleading. Editor-in-Chief Bruce Alberts outlines how journal impact factor distorts the evaluation of scientific research and blocks innovation. He argues that it can bias journals against publishing important papers in fields such as the social sciences and ecology, and that it encourages scientists to crowd into areas of science that are already highly populated, acting as a disincentive to pursue risky and potentially groundbreaking work.
The editorial appeared in response to the release of the San Francisco Declaration on Research Assessment (DORA), which says journal impact factor must not be used as "a surrogate measure of the quality of individual research articles, to assess an individual scientist's contributions, or in hiring, promotion, or funding decisions." DORA’s recommendations to improve the way scientific publications are assessed have already been endorsed by more than 150 leading scientists and 75 scientific organizations, including the American Association for the Advancement of Science (the publisher of Science).
According to Hassan, it is important to recognize that the funders of research increasingly push for business concepts in research management, such as research productivity. “Research managers in their pursuit of results-based management rely on metrics which help them enhance productivity within their organizations,” says Hassan.
“Funders of the CGIAR increasingly drive centres to publish in journals that have high impact factors by taking impact factor into consideration when allocating funding to individual centres,” explains Hassan. “The centres’ research managers tend to pass on this measure of performance to their scientists, who now strive to publish in high-impact journals.”
While impact factor might assess the quality of a journal rather than the scientist who publishes in it, the fact remains that research managers and funders need appropriate metrics to assess the quality and credibility of research that an individual scientist produces.
Hassan suggests that an alternative way to assess the quality of research produced by individual scientists could be to use their h- and i10-indices. The former is the largest number h such that the scientist has h publications with at least h citations each. The latter is the number of publications with at least 10 citations.
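Both indices are computed directly from a scientist's per-publication citation counts, which makes them properties of the individual rather than of the journals they publish in. A minimal sketch, using an illustrative citation list:

```python
def h_index(citations):
    """Largest h such that h publications each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the rank-th most-cited paper still has >= rank citations
        else:
            break
    return h

def i10_index(citations):
    """Number of publications with at least 10 citations."""
    return sum(1 for cites in citations if cites >= 10)

# Illustrative record: seven papers with these citation counts.
cites = [25, 12, 10, 8, 5, 3, 0]
print(h_index(cites))    # 5  (five papers have at least 5 citations each)
print(i10_index(cites))  # 3  (three papers have at least 10 citations)
```

The example also shows why the two indices complement each other: the h-index rewards a sustained body of well-cited work, while the i10-index simply counts papers that have crossed a fixed citation threshold.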
The DORA recommendations call for leaders of the scientific enterprise to analyze the scientific contributions of researchers by actually reading a selection of their publications instead of leaving this to journal editors.
Read: Impact Factor Distortions, editorial in Science 340, 17 May 2013
See also: World Bank. 2003. The CGIAR at 31: A Meta-Evaluation of the Consultative Group on International Agricultural Research, Vol. 2: Technical Report, p. 135. Operations Evaluation Department. Washington, DC: World Bank.