Abstract
Recruitment and professorial appointment procedures are crucial for the administration and management of universities and higher education institutions in order to guarantee a certain level of performance, quality and reputation. The complementary use of quantitative and objective bibliometric analyses is meant to enhance the assessment of candidates and to serve as a possible antidote to subjective, discriminatory and corrupt practices. In this paper, we present the Vienna University bibliometric approach, a method which relies on a variety of basic indicators and further control parameters in order to address the multidimensionality of the problem and to foster comprehensibility. Our “top counts approach” allows an appointment committee to pick and choose from a portfolio of indicators according to its actual strategic alignment. Furthermore, control and additional data help to understand disciplinary publication habits, to unveil concealed aspects and to identify the individual publication strategies of the candidates. Our approach has already been applied to 14 professorial appointment procedures (PAP) in the life sciences, earth and environmental sciences and social sciences, comprising 221 candidates in all. The usefulness of the bibliometric approach was confirmed by all heads of appointment committees in the life sciences. For the earth and environmental sciences as well as the social sciences, the usefulness was less obvious and was sometimes questioned due to the low coverage of the candidates’ publication output in the traditional citation data sources. A retrospective assessment of all PAP performed to date also showed a certain degree of overlap between the committees’ designated top candidates and the bibliometric top candidates.


Notes
Bibliometrics: The Leiden Manifesto for research metrics (http://www.nature.com/news/bibliometrics-the-leiden-manifesto-for-research-metrics-1.17351; Accessed May 15, 2015).
The standard deviation is provided only upon request.
Alternatively, one could use the mean JIF value of the last 10 JCR editions.
Q1 publications are publications in journals that are ranked in the top 25% of the assigned category or categories in JCR according to the JIF.
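To illustrate this definition, the following is a minimal sketch of how the top-quartile (Q1) journals of a single category could be determined. The journal names and JIF values are invented for illustration, not real JCR data, and the simple cutoff ignores ties at the quartile boundary:

```python
# Hypothetical JIF values for journals in one JCR category
# (illustrative numbers, not real JCR data).
category_jifs = {
    "Journal A": 12.4,
    "Journal B": 6.1,
    "Journal C": 3.8,
    "Journal D": 2.2,
    "Journal E": 1.5,
    "Journal F": 0.9,
    "Journal G": 0.4,
    "Journal H": 0.2,
}

def q1_journals(jifs):
    """Return the journals ranked in the top 25% of the category by JIF."""
    ranked = sorted(jifs, key=jifs.get, reverse=True)
    cutoff = max(1, len(ranked) // 4)  # size of the top quartile
    return set(ranked[:cutoff])

print(sorted(q1_journals(category_jifs)))  # the two highest-JIF journals
```

A production implementation would additionally need to handle journals assigned to several categories, as noted above, in which case a publication counts as Q1 if any of its journal's categories yields a top-quartile rank.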
A paper dealing with these aspects and summarizing the results, “Exploration of the bibliometric coordinates for the field of ‘Geography’”, has been accepted for the forthcoming ISSI Conference 2015 in Istanbul.
Ibid.
The same analysis can also be performed by using the percentile values from InCites (WoS Category, fractional count) with no considerable differences. Nevertheless, the suggested approach above guarantees the same thresholds for all applying candidates within a specific PAP. Distorting field differences are considered in Table 3 (Additional Data).
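The pooled-threshold idea can be sketched as follows. All names and citation counts are invented for illustration, and nearest-rank percentiles are only one possible choice; the point is that the cutoff is derived from the pooled publication set, so it is identical for every candidate within the procedure:

```python
# Minimal sketch: one top-10% citation threshold for all candidates
# within a single PAP (illustrative data, hypothetical candidate names).
candidates = {
    "Candidate 1": [45, 12, 7, 3, 0],
    "Candidate 2": [30, 22, 15, 5, 1],
    "Candidate 3": [8, 6, 2, 1, 0],
}

def threshold(pooled, pct):
    """Citation count a paper must reach to be among the top `pct` percent
    of the pooled distribution (simple nearest-rank percentile)."""
    ranked = sorted(pooled, reverse=True)
    k = max(1, round(len(ranked) * pct / 100))
    return ranked[k - 1]

# Pool all publications of all candidates in this procedure.
pooled = [c for papers in candidates.values() for c in papers]
t10 = threshold(pooled, 10)  # the same cutoff applies to every candidate

# Count each candidate's publications at or above the shared threshold.
top10_counts = {name: sum(c >= t10 for c in papers)
                for name, papers in candidates.items()}
```

Because the threshold is computed once from the pooled set rather than per candidate, no applicant is advantaged by a field- or career-stage-specific baseline within the same procedure, which is the property emphasized in the note above.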
Acknowledgments
The authors wish to thank Ralph Reimann, who helped to establish the described “top counts approach” at the University of Vienna. Furthermore, we wish to acknowledge our unit for quality assurance as well as all the heads of professorial appointment committees for their much valued support and input.
Gorraiz, J., Gumpenberger, C. A flexible bibliometric approach for the assessment of professorial appointments. Scientometrics 105, 1699–1719 (2015). https://doi.org/10.1007/s11192-015-1703-6