Volume 112, Issue 1, pp 607–653

Do social sciences and humanities behave like life and hard sciences?

  • Andrea Bonaccorsi
  • Cinzia Daraio
  • Stefano Fantoni
  • Viola Folli
  • Marco Leonetti
  • Giancarlo Ruocco


The quantitative evaluation of the Social Sciences and Humanities (SSH), and the investigation of similarities between SSH and the Life and Hard Sciences (LHS), are at the forefront of scientometrics research. We analyse the scientific production of the universe of Italian academic scholars over the 10-year period 2002–2012, using a national database built by the Italian National Agency for the Evaluation of Universities and Research Institutes (ANVUR). We show that Italian scholars in SSH and LHS are alike in their publishing habits: their output follows the same general law, a lognormal distribution. At the same time, they differ, because we measured their scientific production with the different indicators required by Italian law; after eliminating the “silent” scholars, we obtained different scaling values, which serve as proxies for their productivity rates. Our findings may be useful for developing indirect quali-quantitative comparative analyses across heterogeneous disciplines and, more broadly, for investigating the generative mechanisms behind the observed empirical regularities.
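The lognormal law referred to above can be checked on any per-scholar output measure. A minimal sketch using SciPy follows; since the ANVUR scores themselves are not reproduced here, the sample data below is simulated, and the parameter values are illustrative assumptions, not the paper's estimates:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for per-scholar production scores (illustrative only,
# not the ANVUR data; mean/sigma chosen arbitrarily for the demo).
scores = rng.lognormal(mean=1.5, sigma=0.8, size=5000)

# Fit a lognormal with location fixed at 0, as is usual for positive
# count-like productivity data.
shape, loc, scale = stats.lognorm.fit(scores, floc=0)
mu, sigma = np.log(scale), shape  # parameters of the underlying normal

# Goodness of fit via a Kolmogorov-Smirnov test against the fitted law.
ks = stats.kstest(scores, "lognorm", args=(shape, loc, scale))
print(f"mu={mu:.2f}, sigma={sigma:.2f}, KS p-value={ks.pvalue:.3f}")
```

In this framing, the fitted sigma plays the role of a scale parameter of the distribution, while field-specific differences would show up as different mu (productivity-rate) values across disciplines.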


Keywords: Evaluation; Bibliometrics; Social sciences and humanities; Normalization; Scaling; Universality; Italy



We thank ANVUR for providing the data used in the study. We also thank Professors D. Checchi, A. Graziosi and M. Schaerf for useful discussions and Dr. I. Bongioanni for preliminary data analysis.



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2017

Authors and Affiliations

  1. DESTEC, University of Pisa, Pisa, Italy
  2. FBK-IRVAPP, Istituto per la Ricerca Valutativa sulle Politiche Pubbliche, Trento, Italy
  3. Department of Computer, Control and Management Engineering Antonio Ruberti (DIAG), University of Rome ‘La Sapienza’, Rome, Italy
  4. Fondazione Internazionale Trieste, Trieste, Italy
  5. Center for Life Nano Science, Fondazione Istituto Italiano di Tecnologia (IIT), Rome, Italy
  6. Department of Physics, University of Rome ‘La Sapienza’, Rome, Italy
  7. CNR NANOTEC-Institute of Nanotechnology c/o Campus Ecotekne, University of Salento, Lecce, Italy
