Abstract
Traditional bibliometric indicators are considered too limited for some research areas, such as the humanities and social sciences, because they mostly reveal a specific aspect of academic performance (quantity of publications) and tend to ignore a significant part of research production. The frequent misuses (e.g. improper generalizations) of bibliometric measures result in a substantial part of the research community failing to consider the exact nature of bibliometric measures. This study investigates the links between practices for assessing academic performance, the use of bibliometric methods, and underlying values of research quality within the scientific community of the University of Lausanne, Switzerland. Findings reveal four researcher profiles depending on research orientations and goals, ranging from those using “pure” quantitative tools to those using more subjective and personal techniques. Each profile is characterized according to disciplinary affiliation, tenure, academic function as well as commitment to quality values.
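For readers unfamiliar with the purely quantitative indicators discussed here, the h-index (Hirsch 2005, listed in the references) is a typical example: a researcher has index h if h of his or her papers have each received at least h citations. A minimal sketch of the computation, for illustration only:

```python
def h_index(citations):
    """Compute the h-index: the largest h such that the researcher
    has h papers with at least h citations each (Hirsch 2005)."""
    ranked = sorted(citations, reverse=True)  # most-cited paper first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:   # paper at this rank still has enough citations
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4
print(h_index([25, 8, 5, 3, 3]))  # 3
```

Note how the second researcher, despite one highly cited paper, scores lower: the index captures only one facet of output, which is precisely the kind of limitation the study's respondents point to.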
Notes
http://www.dfg.de/en/magazine/excellence_initiative/index.html. Accessed 9 Jan 2012.
http://www.agence-nationale-recherche.fr/investissementsdavenir/AAP-IDEX-2010.html. Accessed 9 Jan 2012.
http://www.hefce.ac.uk/research/ref/. Accessed 9 Jan 2012.
http://www.obs-ost.fr/. Accessed 19 Sept 2012.
http://www.unil.ch/recherche/page53293.html. Accessed 9 Jan 2012.
http://www-01.ibm.com/software/analytics/spss/. Accessed 19 Sept 2012.
Lemmas and generalized categories are marked with an asterisk.
See Table 1, last two questions.
References
Barnett, A. H., Ault, R. W., & Kaserman, D. L. (1988). The rising incidence of coauthorship in economics: further evidence. Review of Economics and Statistics, 70, 539–543.
Cole, F. J., & Eales, N. B. (1917). The history of comparative anatomy. Part I: a statistical analysis of the literature. Science Progress, 11, 578–596.
Coutrot, L. (2008). Sur l’usage récent des indicateurs bibliométriques comme outil d’évaluation de la recherche scientifique. Bulletin de méthodologie sociologique, 100. Mis en ligne le 01 octobre 2008. http://bms.revues.org/index3353.html. Accessed 14 March 2012.
Egghe, L. (2006). Theory and practise of the g-index. Scientometrics, 69(1), 131–152.
Endersby, J. W. (1996). Collaborative research in the social sciences: multiple authorship and publication credit. Social Science Quarterly, 77, 375–392.
Esterle, L. (2007). La compétition entre institutions de recherche et la mesure de l’excellence. In Management de la recherche (pp. 307–317). De Boeck Université.
Filliatreau, G. (2008). Bibliométrie et évaluation en sciences humaines et sociales: une brève introduction. Revue d’histoire moderne et contemporaine, 55(4), 61–66.
Franceschet, M. (2010). Ten good reasons to use the Eigenfactor™ metrics. Information Processing and Management, 46(5), 555–558.
Garfield, E. (1979). Citation indexing. Its theory and application in science, technology and humanities. New York: Wiley.
Gingras, Y. (2008). La fièvre de l’évaluation de la recherche. Du mauvais usage de faux indicateurs. Revue d’histoire moderne et contemporaine, 55(4), 67–79.
Glänzel, W., & Schoepflin, U. (1994). Little scientometrics, big scientometrics … and beyond? Scientometrics, 30, 375–384.
Harland, T., & Pickering, N. (2011). Values in higher education teaching. London: Routledge.
Harman, K. (2010). Faculty values and expectations and their implications for teaching, learning and the organization of higher education institutions. In E. Baker, P. Peterson & B. McGaw (Eds.), International encyclopedia of education (3rd ed., Vol. 4, pp. 433–440). Oxford: Elsevier Limited.
Harvey, L., & Green, D. (1993). Defining quality. Assessment and Evaluation in Higher Education, 18, 9–35.
Harzing, A. W. (2007). Publish or Perish. http://www.harzing.com/pop.htm. Accessed 19 Sept 2012.
Hayati, Z., & Ebrahimy, S. (2009). Correlation between quality and quantity in scientific production: a case study of Iranian organizations from 1997 to 2006. Scientometrics, 80(3), 627–638.
Hemlin, S. (1993). Scientific quality in the eyes of the scientist. A questionnaire study. Scientometrics, 27(1), 3–18.
Hemlin, S., & Montgomery, H. (1990). Scientists’ conceptions of scientific quality. An interview study. Science Studies, 1, 73–81.
Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences of the United States of America, 102, 16569–16572.
Hofstede, G. (2001). Culture’s consequences: Comparing values, behaviors, institutions, and organizations across nations. Thousand Oaks: Sage Publications, Inc.
Hood, W., & Wilson, C. (2001). The literature of bibliometrics, scientometrics, and informetrics. Scientometrics, 52(2), 291–314.
Horne, R., Petrie, K., & Wessely, S. (2009). H-index pathology: implications for medical researchers and practitioners. British Medical Journal, 339, 1447–1448.
Katharaki, M., & Katharakis, G. (2010). A comparative assessment of Greek universities’ efficiency using quantitative analysis. International Journal of Educational Research, 49, 115–128.
Kroeber, A. L. & Kluckhohn, C. (1952). Culture: a critical review of concepts and definitions. Cambridge (Mass): Papers of the Peabody Museum of American archeology and ethnology, Harvard University XLVII.
Lamont, M. (2009). How Professors think. Cambridge: Harvard University Press.
Lebart, L., & Morineau, A. (1982). SPAD : système portable pour l’analyse des données. Paris: Cesia.
Mallard, G., Lamont, M., & Guetzkow, J. (2009). Fairness as appropriateness: negotiating epistemological differences in peer review. Science Technology Human Values, 34, 573–606.
Moed, H. F., De Bruin, R. E., & Van Leeuwen, Th. N. (1995). New bibliometric tools for the assessment of national research performance: database description, overview of indicators and first applications. Scientometrics, 33, 381–422.
Nalimov, V. V., & Mulchenko, Z. M. (1969). Naukometriya. Izuchenie nauki kak informatsionnogo protsessa [Scientometrics: the study of science as an information process]. Moscow: Nauka.
Ortega, J. L., López-Romero, E., & Fernández, I. (2011). Multivariate approach to classify research institutes according to their outputs: the case of the CSIC’s institutes. Journal of Informetrics, 5, 323–332.
Price, D. (1963). Little science, big science. New York: Columbia University Press.
Reeves, C. A., & Bednar, D. (1994). Defining quality: alternatives and implications. The Academy of Management Review, 19(3), 419–445.
Rostaing, H. (1996). La bibliométrie et ses techniques. Toulouse: Sciences de la société.
Smith, D. R. (2012). Impact factors, scientometrics and the history of citation-based research. Scientometrics, 92(2), 419–427.
Staropoli, A. (1991). The evaluation of research. In U. Dahllöf, J. Harris, M. Shattock, & A. Staropoli (Eds.), Dimensions of evaluation in higher education (pp. 86–100). London: Jessica Kingsley Publishers.
Stensaker, B., & Harvey, L. (2011). Accountability in higher education. Global perspectives on trust and power. London: Routledge.
Van Raan, A. F. J. (1997). Scientometrics: state-of-the-art. Scientometrics, 38(1), 205–218.
Vieira, E. S., & Gomes, J. A. N. F. (2010). A research impact indicator for institutions. Journal of Informetrics, 4(4), 581–590.
Vinkler, P. (1997). Relations of relative scientometric impact indicators. The relative publication strategy index. Scientometrics, 40(1), 163–169.
Vinkler, P. (1998). General performance indexes calculated for research institutes of the Hungarian Academy of Sciences based on scientometric indicators. Scientometrics, 41(1–2), 185–200.
Vinkler, P. (2004). Adalékok a tudománymetria néhány kérdésének megértéséhez [Contributions to the understanding of some questions of scientometrics]. Magyar Tudomány, 49, 789–793.
Vinkler, P. (2008). Tudománymetriai kutatások Magyarországon [Scientometric research in Hungary]. Magyar Tudomány, 11, 1372–1380.
Vinkler, P. (2009). The π-index: a new indicator for assessing scientific impact. Journal of Information Science, 35(5), 602–612.
Vinkler, P. (2010). Indicators are the essence of scientometrics and bibliometrics. Scientometrics, 85, 861–866.
Acknowledgments
The questionnaire was developed with the contribution of Mario Konishi, PhD.
Appendix: Questionnaire used for the study
Q1. How would you define a good scientist in your field of discipline?

Q2. In your view, what is quality research in your field of discipline?

Q3. Have you ever had the opportunity to assess researchers in the following procedures/situations:

(a) Appointment or promotion of a faculty member

(b) Appointment or promotion of an intermediate staff member

(c) Funding decisions intended to encourage an academic career

(d) Awarding of a prize or distinction

Q4. Have you ever had the opportunity to evaluate research papers or projects (e.g. scientific articles, conference papers, dissertations, funding proposals, etc.) in the following procedures/situations:

(a) Reviewing a journal article, a book, or a book chapter for publication

(b) Preliminary assessment or public discussion of a symposium or conference paper

(c) Evaluation and grading of a bachelor’s or master’s thesis or a PhD dissertation

(d) Assessment of a research project in connection with a funding request

Q5. Do you use bibliometric indicators to assess a scientific journal?

Q6. Do you use bibliometric indicators to assess a researcher’s profile?
About this article
Cite this article
Czellar, J., Lanarès, J. Quality of research: which underlying values?. Scientometrics 95, 1003–1021 (2013). https://doi.org/10.1007/s11192-012-0928-x