Summary
This paper addresses the issue of relevancy in evaluating research published in social science journals. Such evaluation initially relies on a critical selection of the databases scientists use. To implement relevant disciplinary evaluations, the method also needs to be scientific, ethical, replicable, comprehensive, flexible, transparent, accessible, incentivizing, productive, updatable and “internationalizable”. This qualitative approach takes into account the current global environment of research. Our method, which introduces these criteria, consists of selecting the databases (whether or not from the Institute for Scientific Information) that scientists favour, crossing them to elaborate new lists of journals, testing those lists, and launching a life-size survey among scientists. This method stands as a prerequisite for further applications. Beyond this rather constructivist approach, such evaluations of research can benefit all the actors participating in the dissemination of knowledge. When implementing these evaluations, we stress the need for international cooperation in devising relevant evaluation criteria and indexes. The appendix presents a case study on French sociology.
Jeannin, P., Devillard, J. Implementing relevant disciplinary evaluations in the social sciences. Scientometrics 63, 121–144 (2005). https://doi.org/10.1007/s11192-005-0206-2