Volume 107, Issue 2, pp 685–699

Taking scholarly books into account: current developments in five European countries

  • Elea Giménez-Toledo
  • Jorge Mañana-Rodríguez
  • Tim C. E. Engels
  • Peter Ingwersen
  • Janne Pölönen
  • Gunnar Sivertsen
  • Frederik T. Verleysen
  • Alesia A. Zuccala


For academic book authors and the institutions assessing their research performance, the relevance of books is undisputed. Nevertheless, the absence of comprehensive international databases covering the items and information needed to assess this type of publication has prompted several European countries to develop custom-built information systems for the registration of scholarly books, together with weighting and funding-allocation procedures. For the first time, these systems make the assessment of books as a research output feasible. The present paper summarizes the main features of the registration and/or assessment systems developed in five European countries/regions (Spain, Denmark, Flanders, Finland and Norway), focusing on how data on book publications are collected and processed, how publications are weighted, and how the results are applied in research assessment and funding.


Keywords: Book assessment models · CRIS · Funding allocation · Book publishers · Social Sciences and Humanities · Indicators · Monographs


This research is partially the result of the project ‘Evaluación de editoriales científicas (españolas y extranjeras) de libros en Ciencias Humanas y Sociales a través de la opinión de los expertos y del análisis de los procesos’ [Evaluation of (Spanish and foreign) scholarly book publishers in the Humanities and Social Sciences through expert opinion and analysis of the processes], HAR2011-30383-C02-01 (Ministerio de Economía y Competitividad, Plan Nacional de I+D+I). F.V. and T.E. thank the Flemish government for financial support through the Centre for R&D Monitoring (ECOOM).



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2016

Authors and Affiliations

  • Elea Giménez-Toledo (1)
  • Jorge Mañana-Rodríguez (1)
  • Tim C. E. Engels (2, 3)
  • Peter Ingwersen (4)
  • Janne Pölönen (5)
  • Gunnar Sivertsen (6)
  • Frederik T. Verleysen (2)
  • Alesia A. Zuccala (4)

  1. Centro de Ciencias Humanas y Sociales, ÍLIA Research Group, CSIC, Madrid, Spain
  2. Centre for R&D Monitoring (ECOOM), Faculty of Social Sciences, University of Antwerp, Antwerp, Belgium
  3. Antwerp Maritime Academy, Antwerp, Belgium
  4. Royal School of Library and Information Science, University of Copenhagen, Copenhagen, Denmark
  5. Publication Forum, Federation of Finnish Learned Societies, Helsinki, Finland
  6. Nordic Institute for Studies in Innovation, Research and Education (NIFU), Tøyen, Oslo, Norway
