Quantity versus impact of software engineering papers: a quantitative study
According to data from the Scopus publication database, as analyzed in several recent studies, more than 70,000 papers have been published in the area of Software Engineering (SE) since the late 1960s. According to our recent work, 43% of those papers have received no citations at all. Since citations are the most commonly used metric for measuring research (academic) impact, these figures raise doubts about the impact of such a large set of papers. Typical academic reward systems encourage researchers to publish more papers and place little emphasis on research impact. To shed light on the issue of volume (quantity) versus citation-based impact of SE research papers, we report in this paper a quantitative bibliometric assessment covering four aspects: (1) quantity versus impact of different paper types (e.g., conference versus journal papers), (2) ratios of uncited (non-impactful) papers, (3) quantity versus impact of papers originating from different countries, and (4) quantity versus impact of papers by each of the top-10 authors (in terms of number of papers). To this end, we conducted an exploratory quantitative bibliometric assessment structured around four research questions, one per aspect. We extracted the data through a systematic, automated and repeatable process from the Scopus paper database, which we also used in two previous papers. Our results show that the distribution of SE publications exhibits major inequality in terms of impact, both overall and when broken down by the four aspects above; in this respect, the SE literature is similar to other areas of science examined in previous bibliometric studies. Journal articles and conference papers have been cited 12.6 and 3.6 times on average, respectively, confirming the expectation that journal articles generally have more impact than conference papers. Papers originating from English-speaking countries also tend to have more visibility and impact (and consequently more citations) than papers originating from non-English-speaking countries. Our results have implications for improving academic reward systems, which nowadays mainly encourage researchers to publish more papers and usually neglect research impact. They can also help researchers in non-English-speaking countries identify improvements that increase the impact of their upcoming papers.
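The indicators reported above (ratio of uncited papers, mean citations per document type) can be reproduced from a Scopus export. The following is a minimal sketch, not the authors' actual extraction pipeline: it assumes a hypothetical CSV export file (scopus_export.csv) and uses the "Document Type" and "Cited by" column names of a standard Scopus CSV export.

```python
# Minimal illustrative sketch (assumed file name and layout, not the paper's pipeline):
# compute the uncited-paper ratio and mean citations per document type
# from a Scopus CSV export.
import csv
from collections import defaultdict

def summarize(path="scopus_export.csv"):
    counts = defaultdict(int)     # number of papers per document type
    citations = defaultdict(int)  # total citations per document type
    uncited = 0
    total = 0
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            doc_type = row.get("Document Type", "Unknown")
            cited_by = int(row.get("Cited by") or 0)
            counts[doc_type] += 1
            citations[doc_type] += cited_by
            total += 1
            uncited += (cited_by == 0)
    if total == 0:
        print("No records found.")
        return
    print(f"Uncited ratio: {uncited / total:.1%} of {total} papers")
    for doc_type, n in sorted(counts.items(), key=lambda kv: -kv[1]):
        print(f"{doc_type}: {n} papers, {citations[doc_type] / n:.1f} citations on average")

if __name__ == "__main__":
    summarize()
```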
Keywords: Bibliometrics · Software engineering · Research impact · Countries · Authors · Exploratory study
Vahid Garousi was partially supported by several internal grants provided by Hacettepe University and the Scientific and Technological Research Council of Turkey (TÜBİTAK). João M. Fernandes was supported by FCT - Fundação para a Ciência e Tecnologia within the Project Scope UID/CEC/00319/2013.