RAM the PI-BETA, C3PO - what the H-STAR happened to my promotion application? Or: The pros and cons of bibliometric evaluations of researchers
Michael C. Calver, 2013. "RAM the PI-BETA, C3PO - what the H-STAR happened to my promotion application? Or: The pros and cons of bibliometric evaluations of researchers", in Daniel Lunney, Pat Hutchings and Harry Recher (eds), Grumpy Scientists: The Ecological Conscience of a Nation.
Bibliometrics - methods to quantitatively analyse the quality and impact of scientific or technical literature - are now a central part of the management of modern science. Through them, research managers seek to encourage quality and productivity and use scarce research funds effectively. Researchers are ranked on a range of quantitative assessments to measure the quality of their work, and the results influence employment prospects, grants, tenure and promotions. Unfortunately, researchers anxious to maximise their prospects may concentrate on good scores, not good science. This could change what they research, what they publish and where they publish. Natural history, baseline research and research of regional (but not international) significance could be marginalised despite the clear benefits of such research for monitoring, hypothesis generation and local management. These difficulties are compounded by inappropriate applications of common bibliometric statistics, such as the persistence of the discredited views that the quality of a paper may be judged by the journal in which it appears or that a simple citation count alone indicates the merit of a paper or a researcher. This paper takes a role-playing approach, centred on a fictitious interview as part of a promotion application, to explore some of the uses and misuses of bibliometrics and how researchers can present their case honestly, while defending against abuses and championing unfashionable but valuable areas of research.
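The abstract's point that a raw citation count is a poor proxy for merit can be illustrated with the h-index, one of the common bibliometric statistics the chapter's title plays on. The sketch below is purely illustrative (the researchers and citation figures are hypothetical, not drawn from the chapter): two records with identical total citations receive very different h-indices, showing that the two statistics measure different things and that neither alone settles a promotion case.

```python
def h_index(citations):
    """h-index (Hirsch 2005): the largest h such that the researcher
    has h papers with at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

# Two hypothetical researchers, each with 100 total citations:
steady = [10] * 10        # ten papers cited 10 times each -> h = 10
one_hit = [91] + [1] * 9  # one blockbuster, nine barely cited -> h = 1

print(sum(steady), h_index(steady))    # 100 10
print(sum(one_hit), h_index(one_hit))  # 100 1
```

The contrast cuts both ways: a citation count rewards the one-hit record, while the h-index rewards sustained output, so a researcher in a slow-citing field such as natural history can be disadvantaged by either metric.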