Abstract
Recent years have seen enormously increased interest in the comparative evaluation of research quality in the UK, with considerable resources devoted to ranking the output of academic institutions relative to one another at the sub-discipline level, and the disposition of even greater resources dependent on the outcome of this process. The preferred methodology has been traditional peer review, with expert groups of academics tasked to assess the relative worth of all research activity in ‘their’ field. Extending to institutional evaluation a recently refined technique of journal ranking (Discipline Contribution Scoring) holds out the possibility of ‘automatic’ evaluation within a time-frame considerably shorter than would be required by methods based directly on citation counts within the corpus of academic work under review. This paper tests the feasibility of the technique in the sub-field of Business and Management Studies research, producing rankings that are highly correlated with those generated by the much more complex and expensive direct peer review approach. More generally, the analysis also gives a rare opportunity to compare directly the equivalence of peer review and bibliometric analysis over a whole sub-field of academic activity in a non-experimental setting.
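The mechanics behind the abstract can be illustrated with a minimal sketch. All journal names, discipline-contribution scores (DCS), publication counts, and peer-review grades below are invented for illustration and are not the paper's data or exact formulas: each institution is scored by DCS-weighting its journal output, and the resulting ranking is compared with a peer-review ranking via Spearman's rank correlation.

```python
# Illustrative sketch only: hypothetical DCS values, outputs, and peer grades.

# Assumed discipline-contribution score per journal (fraction of the
# journal's content contributing to the discipline, here made up).
dcs = {"J. Mgmt": 0.9, "Small Bus. Rev": 0.6, "Gen. Quarterly": 0.2}

# Hypothetical publication counts per institution, per journal.
outputs = {
    "Uni A": {"J. Mgmt": 10, "Small Bus. Rev": 2},
    "Uni B": {"J. Mgmt": 3, "Gen. Quarterly": 8},
    "Uni C": {"Small Bus. Rev": 5, "Gen. Quarterly": 1},
}

# Bibliometric score: DCS-weighted output count per institution.
biblio = {u: sum(dcs[j] * n for j, n in pubs.items())
          for u, pubs in outputs.items()}

# Hypothetical peer-review grades (higher = better).
peer = {"Uni A": 5, "Uni B": 3, "Uni C": 4}

def ranks(scores):
    # Rank 1 = highest score (this toy data has no ties).
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {u: i + 1 for i, u in enumerate(ordered)}

rb, rp = ranks(biblio), ranks(peer)
n = len(rb)
d2 = sum((rb[u] - rp[u]) ** 2 for u in rb)
# Spearman's rho for untied ranks: 1 - 6*sum(d^2) / (n*(n^2 - 1)).
rho = 1 - 6 * d2 / (n * (n ** 2 - 1))
print(rho)  # 0.5 for this toy data
```

In the paper's setting the correlation is computed over real RAE peer-review grades and DCS-based rankings; the toy figure above merely shows the arithmetic of the comparison.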
References
HEFCE, A Report for the UFC on the Conduct of the 1992 Research Assessment Exercise, Higher Education Funding Council for England: Bristol, June 1993.
HEFCE, Guidelines for the Conduct of the 1995 Research Assessment Exercise, Higher Education Funding Council for England: Bristol, 1994.
P. R. Thomas, Size effects in the assessment of discipline-contribution scores: an example from the social sciences, Scientometrics, 33(2) (1995) 203–220.
D. S. Watkins, Changes in the nature of UK small business research, 1980–1990. Part One: Changes in producer characteristics, Small Business and Enterprise Development, 1(3) (1994) 28–31.
D. S. Watkins, Changes in the nature of UK small business research, 1980–1990. Part Two: Changes in the nature of the output, Small Business and Enterprise Development, 2(1) (1995) 59–66.
J. Taylor, Measuring research performance in Business and Management Studies in the United Kingdom: The 1992 research assessment exercise, British Journal of Management, 5(4) (1994) 275–288.
C. Oppenheim, The correlation between citation counts and the 1992 research assessment exercise ratings for British Library and Information Science university departments, Journal of Documentation, 51(1) (1995) 18–27.
A. M. Colman, D. Dhillon, B. Coulthard, A bibliometric evaluation of the research performance of British university politics departments: publications in leading journals, Scientometrics, 32(1) (1995) 49–66.
A. F. J. Van Raan, Advanced bibliometric methods as quantitative core of peer review based evaluation and foresight studies, Scientometrics, 36(3) (1996) 397–420.
H. Peters, A. Van Raan, On determinants of citation scores: a case study in Chemical Engineering, Journal of the American Society for Information Science, 45(1) (1994) 39–49.
P. R. Thomas, op. cit. 3.
G. N. Gilbert, Referencing as persuasion, Social Studies of Science, 7 (1977) 113–122.
H. Peters, A. Van Raan, op. cit. 10.
J. G. Shaw, Article-by-article citation analysis of medical journals, Scientometrics, 12 (1987) 101–110.
S. E. Wiberley, Journal rankings from citation studies: a comparison of national and local data from social work, Library Quarterly, 52(4) (1982) 348–359.
P. Doreian, Measuring the relative standing of disciplinary journals, Information Processing and Management, 24(1) (1988) 45–56.
P. R. Thomas, op. cit. 3.
H. F. Moed, W. Burger, J. Frankfort, A. Van Raan, The use of bibliometric data for the measurement of university research performance, Research Policy, 14 (1985) 131–149.
E. J. Rinia, C. de Lange, H. Moed, Measuring national output in Physics: delimitation problems, Scientometrics, 28(1) (1993) 89–110.
A. Bekavac, J. Petrak, Z. Buneta, Citation behavior and place of publication in the authors from the scientific periphery: a matter of quality? Information Processing and Management, 30(1) (1994) 33–42.
P. Doreian, A measure of standing for citation networks within a wider environment, Information Processing and Management, 30(1) (1994) 21–31.
R. Coe, I. Weinstock, Evaluating the management journals: a second look, Academy of Management Journal, 27 (1984) 660–666.
L. R. Gomez-Mejia, D. B. Balkin, Determinants of faculty pay: an agency theory perspective, Academy of Management Journal, 35 (1992) 921–955.
A. M. Colman et al., op. cit. 8.
M. M. Extejt, J. E. Smith, The behavioral sciences and management: an evaluation of relevant journals, Journal of Management, 16 (1990) 539–551.
R. T. Gillett, Serious anomalies in the UGC comparative evaluation of the research performance of psychology departments, Bulletin of the British Psychological Society, 40 (1987) 42–49.
G. Johnes, Research performance indicators in the university sector, Higher Education Quarterly, 42 (1988) 54–71.
J. B. Bavelas, The social psychology of citation, Canadian Psychological Review, 19(2) (1978) 158–163.
J. L. Johnson, P. M. Podsakoff, Journal influence in the field of management: an analysis using Salancik's Index in a dependency network, Academy of Management Journal, 37(5) (1994) 1392–1407.
K. E. Clark, America's Psychologists: A Survey of a Growing Profession, Washington: American Psychological Association, 1957.
F. Narin, Evaluative Bibliometrics, Cherry Hill, New Jersey: Computer Horizons, 1976.
L. R. Gomez-Mejia et al., op. cit. 23.
A. Schubert, T. Braun, Reference standards for citation based assessments, Scientometrics, 26 (1993) 21–35.
A. M. Colman et al., op. cit. 8.
A. J. Nederhof, A. F. J. Van Raan, A bibliometric analysis of six economics research groups: A comparison with peer review, Research Policy, 22 (1993) 353–368.
A. F. J. Van Raan, op. cit. 9.
A. Sandison, The use of older literature and its obsolescence, Journal of Documentation, 27 (1971) 184–189.
E. Garfield, Citations-to divided by items-published gives the impact factor, Current Contents, 15 (1972) 6–7.
L. M. Raisig, Mathematical evaluation of the scientific serial, Science, 131 (1960) 1417.
P. Doreian, op. cit. 16.
C. He, M. L. Pao, A discipline-specific journal selection algorithm, Information Processing and Management, 22 (1986) 405–416.
G. Hirst, Discipline impact factors: a method for determining core journal lists, Journal of the American Society for Information Science, 29 (1978) 171–172.
P. Pichappan, Identification of mainstream journals of science specialty: a method using the discipline-contribution score, Scientometrics, 27 (1993) 179–193.
P. R. Thomas, op. cit. 3.
P. R. Thomas, op. cit. 3, Appendix A.
J. Taylor, op. cit. 6.
J. L. Johnson, P. M. Podsakoff, op. cit. 29.
A. M. Colman et al., op. cit. 8.
C. Oppenheim, op. cit. 7.
A. J. Nederhof, A. F. J. Van Raan, op. cit. 35.
A. J. Nederhof, A. F. J. Van Raan, op. cit. 35, p. 413.
HEFCE, op. cit. 1.
ABRC, Peer Review: A Report of the Advisory Board for the Research Councils from the Working Party on Peer Review (‘Boden Report’), London: ABRC, 1990.
J. S. Armstrong, Peer review for journals: Evidence on quality control, fairness, and innovation, Science and Engineering Ethics, 3(1) (1997) 63–84.
J. Taylor, op. cit. 6.
ISI, Journal Citation Reports for 1993, Philadelphia: Institute for Scientific Information.
G. S. Howard, D. A. Cole, S. E. Maxwell, Research productivity in psychology based on publication in the journals of the American Psychological Association, American Psychologist, 42 (1987) 975–986.
C. Oppenheim, op. cit. 7.
L. B. Seng, P. Willett, The citedness of publications by United Kingdom schools and departments of library and information studies, Journal of Information Science, 21(1) (1995).
D. S. Watkins, op. cit. 4.
D. S. Watkins, op. cit. 5.
B. Fender, Speech to the Association of Business Schools Conference on RAE96, Harrogate, February 1997.
DfEE, Report of the National Committee of Inquiry into Higher Education: Higher Education in the Learning Society (‘Dearing Report’), London: Department for Education and Employment, July 1997.
Cite this article
Thomas, P.R., Watkins, D.S. Institutional research rankings via bibliometric analysis and direct peer review: A comparative case study with policy implications. Scientometrics 41, 335–355 (1998). https://doi.org/10.1007/BF02459050