This paper investigates the fitness-for-purpose and soundness of bibliometric parameters for measuring and elucidating the research performance of individual researchers in the field of education sciences in Switzerland. In order to take into account the specificities of publication practices of researchers in education sciences, the analyses are based on two separate databases: Web of Science and Google Scholar. Both databases show a very unequal distribution of the individual research output, and the indicators used to measure research performance (quantity of publications and citation impact) from the two data sources are highly positively correlated. However, individual characteristics of the researchers, such as age, gender and academic position, that serve to explain the great variance in research performance, can only be identified if the Web of Science is used as a benchmark of research performance. The results indicate that Google Scholar is so inclusive that it impedes a meaningful interpretation of the data. However, the Web of Science inclusion policy for journals is also associated with certain shortcomings that put some researchers at an unjustified disadvantage. Therefore, problems currently exist in regard to both citation databases when used to benchmark individual research performance.
Because Web of Science only includes publications that have appeared in scientific journals listed in the Social Sciences Citation Index, even a purely quantitative analysis of these publications implies a qualitative element, as the vast majority of included articles will have been subjected to peer review prior to acceptance for publication.
Citations may be meaningless or have negative connotations, and citation impacts may be inflated by "citation cartels" and self-citations.
A strong national focus and use of the national language was also observed in the German study by Dees (2008): 88% of the publications analysed were written in German.
Investigations in related social science research areas suggest that the skewed distribution for research performance is not solely explained by the fact that researchers differ in the types of publication they prefer and are more or less likely to be included in databases on that account. Researchers with a high level of publishing activity in one particular type of publication (monograph, book chapter, journal article) tend to have higher publishing outputs in respect of other types of publication as well (Puuska 2010). Nor is the skewed distribution likely to be due to a quantity versus quality trade-off; for instance, a study by Bernauer and Gilardi (2010) looking at political science shows that researchers who publish more articles also tend to have higher rates of publication in journals with a higher impact factor.
In theory, this form of analysis would also be feasible with the data in this paper. In practice, the low overall number of observable professors is a prohibitive factor.
Van Raan et al. (2011) additionally find that non-English articles are cited less often, leading to lower values on the impact measures.
They were identified based on a directory of the conference of Swiss university presidents (CRUS annuaire; updated version February 2010). The directory provides information about all professors working at Swiss universities by subject field.
The small sample size of just 51 professors, even taking into account the size of Switzerland (fewer than 8 million inhabitants), can be explained by two factors. Firstly, the higher education sector in Switzerland is smaller than in other countries of comparable population size; secondly, professors working in teacher training institutions are not counted, as teacher education (for teachers in compulsory schooling) is organized in specific universities of teacher training that do not have the right to award PhDs and are therefore less research-oriented. The advantage in the case of Switzerland, however, is that all universities are considered to be “research universities”, and therefore all professors working at these institutions should be assessed on the basis of research excellence.
For instance, links to publishers' or university homepages.
Due to our limited database, methods accounting for different types of research output (see De Witte and Rogge 2010) cannot be applied.
The correlation persists if the publications published in both databases are taken out of the calculations.
The self-citation rate is unknown. If it is similar to the self-citation rate in Web of Science, the associated bias is negligible.
The Gini index is a standardized measure of the area between the Lorenz curve and the line of perfect equality. If the Lorenz curve coincides with that line (a perfectly equal distribution), the value of the index is zero; for a totally unequal distribution (in our case, only one professor publishing), the measure would be one. Therefore, the closer the value is to one, the more unequal the distribution of the research output.
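The index described above can be computed directly from a vector of individual publication counts. The following is a minimal illustrative sketch (the function name and example values are ours, not taken from the paper), using the standard formula for the Gini coefficient of ordered non-negative values:

```python
def gini(values):
    """Gini index of non-negative outputs (e.g. publication counts per professor).

    Returns 0 for a perfectly equal distribution; values approach 1 as
    output concentrates on a single individual.
    """
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard formula for sorted values x_1 <= ... <= x_n:
    # G = 2 * sum(i * x_i) / (n * sum(x_i)) - (n + 1) / n
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2 * weighted / (n * total) - (n + 1) / n

print(gini([5, 5, 5, 5]))   # equal output -> 0.0
print(gini([0, 0, 0, 20]))  # one publishing professor among four -> 0.75
```

With more observations, the extreme case (a single publisher among n researchers) yields (n - 1)/n, which tends to one as n grows, matching the interpretation in the note above.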
Another possible explanation might be that long-standing researchers benefit from structural privileges due to their increasing fame/reputation (e.g. as regards allocation of research funding or inclusion of an article in a journal due to a position on the editorial board).
The findings of van Aalst (2010) indicate, however, that the obscured correlations (due to background noise) may be partly reduced by information about the specific types of publication (books, book chapters, dissertations, conference papers). This paper does not provide a more detailed attribution of Google Scholar publications because, firstly, attribution to a specific type of publication in itself tends to be the consequence of an arbitrary decision, and, secondly, not all of the links in Google Scholar actually enable access to a document (which would, however, be necessary for attribution to a specific form of publication).
Aaltojärvi, I., Arminen, I., Auranen, O., & Pasanen, H.-M. (2008). Scientific productivity, web visibility and citation patterns in sixteen Nordic sociology departments. Acta Sociologica, 51, 5–22.
Abramo, G., D’Angelo, C. A., & Caprasecca, A. (2009). Gender differences in research productivity: A bibliometric analysis of the Italian academic system. Scientometrics, 79, 517–539.
Adler, R., Ewing, J., & Taylor, P. (2009). Citation statistics. Statistical Science, 24, 1–14.
Archambault, É., Vignola-Gagne, É., Côté, G., Larivière, V., & Gingras, Y. (2006). Benchmarking scientific output in the social sciences and humanities: The limits of existing databases. Scientometrics, 68, 329–342.
Bakanic, V., McPhail, C., & Simon, R. J. (1987). The manuscript review and decision-making process. American Sociological Review, 52, 631–642.
Bernauer, T., & Gilardi, F. (2010). Publication output of Swiss political science departments. Swiss Political Science Review, 16, 279–303.
Bonaccorsi, A., & Daraio, C. (2003). Age effects in scientific productivity: The case of the Italian National Research Council (CNR). Scientometrics, 58, 49–90.
Borrego, A., Barrios, M., Villarroya, A., & Ollé, C. (2010). Scientific output and impact of postdoctoral scientists: A gender perspective. Scientometrics, 83, 93–101.
Bozeman, B., & Gaughan, M. (2011). How do men and women differ in research collaborations? An analysis of the collaborative motives and strategies of academic researchers. Research Policy, 40, 1393–1402.
Budd, J. M., & Magnuson, L. (2010). Higher education literature revisited: Citation patterns examined. Research in Higher Education, 51, 294–304.
Butler, L. (2002). A list of published papers is no measure of value. Nature, 419, 877.
Carayol, N., & Matt, M. (2006). Individual and collective determinants of academic scientists’ productivity. Information Economics and Policy, 18, 55–72.
European Commission. (2010). Assessing Europe’s university-based research (EUR 24187 EN). Brussels: European Commission.
Corby, K. (2001). Method or madness? Educational research and citation prestige. Libraries and the Academy, 1, 279–288.
Corby, K. (2003). Constructing core journal lists: Mixing science and alchemy. Libraries and the Academy, 3, 207–217.
Cusin, C., Grossenbacher, S., & Vögeli-Mantovani, U. (2000). FER-Studie “Prospective de la recherche en éducation en Suisse”: Teilstudie Erziehungswissenschaften an Schweizer Universitäten (Orientierung, Produktivität und Nachwuchsförderung). Aarau: SKBF.
D’Amico, R., Vermigli, P., & Canetto, S. S. (2011). Publication productivity and career advancement by female and male psychology faculty: The case of Italy. Journal of Diversity in Higher Education, 4, 175–184.
De Witte, K., & Rogge, N. (2010). To publish or not to publish? On the aggregation and drivers of research performance. Scientometrics, 85, 657–680.
Dees, W. (2008). Innovative scientometric methods for a continuous monitoring of research activities in educational science. In H. Kretschmer & F. Havemann (Eds.), Proceedings of WIS 2008, Fourth International Conference on Webometrics, Informetrics and Scientometrics & Ninth COLLNET Meeting (pp. 1–10). Berlin: Ges. für Wissenschaftsforschung.
Earp, V. J. (2010). A bibliometric snapshot of the journal of higher education and its impact on the field. Behavioral & Social Sciences Librarian, 29, 283–295.
Fairbairn, H., Holbrook, A., Bourke, S., Preston, G., Cantwell, R., & Scevak, J. (2009). A profile of education journals. In P. Jeffrey (Ed.), AARE 2008 conference papers collection. The Australian Association of Research in Education. Retrieved June 9, 2011, from http://www.aare.edu.au/08pap/fai08605.pdf.
Fernández-Cano, A., & Bueno, Á. (1999). Synthesizing scientometric patterns in Spanish educational research. Scientometrics, 46, 349–367.
Fröhlich, G. (1999). Das Messen des leicht Messbaren. Output-Indikatoren, Impact-Masse: Artefakte der Szientometrie? In J. Becker & W. Göhring (Eds.), Kommunikation statt Markt. Zu einer alternativen Theorie der Informationsgesellschaft (pp. 27–38). Sankt Augustin: GMD—Forschungszentrum Informationstechnik GmbH.
García-Pérez, M. (2010). Accuracy and completeness of publication and citation records in the Web of Science, PsycINFO, and Google Scholar: A case study for the computation of h indices in psychology. Journal of the American Society for Information Science and Technology, 61, 2070–2085.
Gonzalez-Brambila, C., & Veloso, F. M. (2007). The determinants of research output and impact: A study of Mexican researchers. Research Policy, 36, 1035–1051.
Graber, M., Launov, A., & Wälde, K. (2008). Publish or perish? The increasing importance of publications for prospective economics professors in Austria, Germany and Switzerland. German Economic Review, 9, 457–472.
Haddow, G., & Genoni, P. (2010). Citation analysis and peer ranking of Australian social science journals. Scientometrics, 85, 471–487.
Hall, B. H., Mairesse, J., & Turner, L. (2005). Identifying age, cohort and period effects in scientific research productivity: Discussion and illustration using simulated and actual data on French physicists. NBER Working Paper Series, 11739. Retrieved April 13, 2011, from http://www.nber.org/papers/w11739.
Harzing, A.-W. (2007). Publish or Perish. Retrieved September 15, 2010, from http://www.harzing.com/pop.htm.
Harzing, A.-W. K., & van der Wal, R. (2008). Google Scholar as a new source for citation analysis. Ethics in Science and Environmental Politics, 8, 61–73.
Hicks, D. (1999). The difficulty of achieving full coverage of international social science literature and the bibliometric consequences. Scientometrics, 44, 193–215.
Hicks, D., Tomizawa, H., Saitoh, Y., & Kobayashi, S. (2004). Bibliometric techniques in the evaluation of federally funded research in the United States. Research Evaluation, 13, 78–86.
Hicks, D., & Wang, J. (2011). Coverage and overlap of the new social sciences and humanities journal lists. Journal of the American Society for Information Science and Technology, 62, 284–294.
Hornbostel, S., & Keiner, E. (2002). Evaluation der Erziehungswissenschaft. Zeitschrift für Erziehungswissenschaft, 5, 634–653.
Huang, M., & Chang, Y. (2008). Characteristics of research output in social sciences and humanities: From a research evaluation perspective. Journal of the American Society for Information Science and Technology, 59, 1819–1828.
Hunter, L. A., & Leahey, E. (2010). Parenting and research productivity: New evidence and methods. Social Studies of Science, 40, 433–451.
Jacsó, P. (2008). Google Scholar revisited. Online Information Review, 32, 102–114.
Jansen, D., Wald, A., Franke, K., Schmoch, U., & Schubert, T. (2007). Third party research funding and performance in research. Kölner Zeitschrift für Soziologie und Sozialpsychologie, 59, 125–149.
Jensen, P., Rouquier, J.-B., & Croissant, Y. (2009). Testing bibliometric indicators by their prediction of scientists promotions. Scientometrics, 78, 467–479.
Jokić, M., & Ball, R. (2006). Qualität und Quantität wissenschaftlicher Veröffentlichungen: Bibliometrische Aspekte der Wissenschaftskommunikation. Jülich: Forschungszentrum Jülich GmbH.
Keiner, E. (1999). Erziehungswissenschaft 1947–1990: Eine empirische und vergleichende Untersuchung zur kommunikativen Praxis einer Disziplin. Weinheim: Deutscher Studien Verlag.
Knorr, K. D., Mittermeir, R., Aichholzer, G., & Waller, G. (1979). Individual publication productivity as a social position effect in academic and industrial research units. In F. M. Andrews (Ed.), The effectiveness of research groups in six countries (pp. 55–94). Cambridge: Cambridge University Press.
Krampen, G., Becker, R., Wahner, U., & Montada, L. (2007). On the validity of citation counting in science evaluation: Content analyses of references and citations in psychological publications. Scientometrics, 71, 191–202.
Kroc, R. J. (1984). Using citation analysis to assess scholarly productivity. Educational Researcher, 13, 17–22.
Kyvik, S. (1996). Child care, research collaboration, and gender differences in scientific productivity. Science, Technology and Human Values, 21, 54–71.
Larivière, V., Vignola-Gagné, É., Villeneuve, C., Gélinas, P., & Gingras, Y. (2011). Sex differences in research funding, productivity and impact: An analysis of Québec university professors. Scientometrics, 87, 483–498.
Leinenkugel, P., Dees, W., & Rittberger, M. (2011). Abdeckung erziehungswissenschaftlicher Zeitschriften in Google Scholar. In J. Griesbaum, T. Mandel, & C. Womser-Hacker (Eds.), Information und Wissen: Global, sozial und frei? (pp. 160–170). Boizenburg: Hülsbusch.
Levin, S. G., & Stephan, P. E. (1991). Research productivity over the life cycle: Evidence for academic scientists. American Economic Review, 81, 114–132.
Linmans, A. J. M. (2010). Why with bibliometrics the humanities does not need to be the weakest link: Indicators for research evaluation based on citations, library holdings, and productivity measures. Scientometrics, 83, 337–354.
Long, J. S., Allison, P. D., & McGinnis, R. (1993). Rank advancement in academic careers: Sex differences and the effects of productivity. American Sociological Review, 58, 703–722.
Luce, T. S., & Johnson, D. M. (1978). Rating of educational and psychological journals. Educational Researcher, 7, 8–10.
McNally, G. P. (2010). Scholarly productivity, impact, and quality among academic psychologists at group of eight universities. Australian Journal of Psychology, 62, 204–215.
Meho, L. I., & Yang, K. (2007). Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar. Journal of the American Society for Information Science and Technology, 58, 2105–2125.
Moed, H. F. (2005). Citation analysis in research evaluation. Dordrecht: Springer.
Nederhof, A. J. (2006). Bibliometric monitoring of research performance in the social sciences and the humanities: A review. Scientometrics, 66, 81–100.
Neuhaus, C. (2010). Vergleichende Analysen von Forschungsleistungen: Forschungsgruppen im Spiegel bibliometrischer Indikatoren. Baden-Baden: Nomos.
Paludkiewicz, K., & Wohlrabe, K. (2010). Qualitätsanalyse von Zeitschriften in den Wirtschaftswissenschaften: Über Zitationsdatenbanken und Impaktfaktoren im Online-Zeitalter. Ifo Schnelldienst, 63, 18–28.
Puuska, H.-M. (2010). Effects of scholar’s gender and professional position on publishing productivity in different publication types: Analysis of a Finnish university. Scientometrics, 82, 419–437.
Rauber, M., & Ursprung, H. W. (2008). Life cycle and cohort productivity in economic research: The case of Germany. German Economic Review, 9, 431–456.
Rey, O. (2009). Quality indicators and educational research publications: Which publications count? Dossier d’actualité de la VST, 46. Retrieved June 14, 2011, from http://www.inrp.fr/vst/LettreVST/english/46-june-2009_en.php?onglet=integrale.
Rokach, L., Kalech, M., Blank, I., & Stern, R. (2011). Who is going to win the next association for the advancement of artificial intelligence fellowship award? Evaluating researchers by mining bibliographic data. Journal of the American Society for Information Science and Technology, 62, 2456–2470.
Sax, L. J., Hagedorn, L. S., Arredondo, M., & DiCrisi, F. A., III. (2002). Faculty research productivity: Exploring the role of gender and family-related factors. Research in Higher Education, 43, 423–446.
Schulze, G. G., Warning, S., & Wiermann, C. (2008). Zeitschriftenrankings für die Wirtschaftswissenschaften: Konstruktion eines umfassenden Metaindexes. Perspektiven der Wirtschaftspolitik, 9, 286–305.
Shin, E.-J. (2004). Measuring the impact of electronic publishing on citation indicators of education journals. Libri, 54, 221–227.
Shin, J. C., & Cummings, W. K. (2010). Multilevel analysis of academic publishing across disciplines: Research preference, collaboration, and time on research. Scientometrics, 85, 581–594.
Smart, J. C. (1983). Perceived quality and citation rates of education journals. Research in Higher Education, 19, 175–182.
Smeby, J.-C., & Try, S. (2005). Departmental contexts and faculty research activity in Norway. Research in Higher Education, 46, 593–619.
Stack, S. (2004). Gender, children and research productivity. Research in Higher Education, 45, 891–920.
Togia, A., & Tsigilis, N. (2006). Impact factor and education journals: A critical examination and analysis. International Journal of Educational Research, 45, 362–379.
van Aalst, J. (2010). Using Google Scholar to estimate the impact of journal articles in education. Educational Researcher, 39, 387–400.
van Leeuwen, T. (2006). The application of bibliometric analyses in the evaluation of social science research. Who benefits from it, and why it is still feasible. Scientometrics, 66, 133–154.
van Ours, J. (2009). Will you still need me: When I’m 64? De Economist, 157, 441–460.
van Raan, A. F. J., van Leeuwen, T. N., & Visser, M. S. (2011). Severe language effect in university rankings: Particularly Germany and France are wronged in citation-based rankings. Scientometrics, 88, 495–498.
Wellington, J., & Torgerson, C. J. (2005). Writing for publication: What counts as a ‘high status, eminent academic journal’? Journal of Further and Higher Education, 29, 35–48.
Winkelmann, R. (2008). Economic analysis of count data (5th ed.). New York: Springer.
We gratefully acknowledge Alexander Botte, Werner Dees, Daniel Munich, Olivier Rey and two anonymous referees for their helpful comments and suggestions.
Diem, A., Wolter, S.C. The Use of Bibliometrics to Measure Research Performance in Education Sciences. Res High Educ 54, 86–114 (2013). https://doi.org/10.1007/s11162-012-9264-5