Abstract
The average citation impact factor is often used to make international comparisons of scientific performance within a given discipline. Averaging over all (or all cited) papers, however, may give undue weight to papers with few citations, whereas the standing of a country within a field would be better assessed by looking only at the “successful” papers in that discipline. The present paper suggests averaging citations only over the ten (or twenty) percent most cited papers in a discipline and using these averages to rank countries. The case of Israel is used to illustrate this approach.
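The averaging step can be illustrated with a short sketch. The code below is a hypothetical implementation, not taken from the paper: the data, the function name top_fraction_mean, and the choice of rounding the decile size upward are all assumptions made for illustration. It computes the mean citation count over the top ten percent most cited papers per country and ranks the countries by that value, which makes the indicator insensitive to the long tail of rarely cited papers.

```python
# Minimal sketch of the proposed indicator: rank countries by the mean
# citation count of their top 10% (decile) most cited papers in a field.
# Toy data and names; not the paper's actual dataset or method code.
import math

def top_fraction_mean(citations, fraction=0.10):
    """Mean citations of the top `fraction` most cited papers (at least one)."""
    if not citations:
        return 0.0
    ranked = sorted(citations, reverse=True)
    k = max(1, math.ceil(len(ranked) * fraction))  # size of the top decile
    top = ranked[:k]
    return sum(top) / len(top)

# Hypothetical citation counts per paper, by country, in one discipline.
papers_by_country = {
    "Country A": [0, 1, 1, 2, 3, 5, 8, 40, 55, 90],
    "Country B": [2, 2, 3, 3, 4, 4, 5, 6, 7, 12],
}

ranking = sorted(
    ((top_fraction_mean(c), name) for name, c in papers_by_country.items()),
    reverse=True,
)
for mean_top, name in ranking:
    print(f"{name}: mean citations of top decile = {mean_top:.1f}")
```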
Cite this article
Czapski, G. The use of deciles of the citation impact to evaluate different fields of research in Israel. Scientometrics 40, 437–443 (1997). https://doi.org/10.1007/BF02459291