Abstract
Scientific production in Brazil has increased considerably over the last two decades, positioning the country as an emerging scientific power. The field of Education shows the same trend, with significant growth in the number of articles published by Brazilian researchers in recent years. In this article we evaluate Brazilian research output in Education from 2007 to 2016 to understand how national evaluation standards compare with the international parameters dictated by bibliometric indicators. We confront the citation impact of Brazilian publications in the period with their classification in QUALIS, an expert-based journal evaluation system, using Scopus’ SNIP (Source Normalized Impact per Paper) indicator for the analysis. The study was carried out on data about 40,825 articles published in 2,719 different journals. Results showed that only a small percentage of these articles appeared in Scopus-indexed journals (13.28%), and that most of those journals were published in Brazil (66%). The citation impact of the Scopus-indexed publications grew significantly over the period, but journals with dissimilar citation impact were not distinctly distributed across separate QUALIS categories. These findings point to a publishing pattern that is likely related to the coupling of the Brazilian research evaluation and funding systems. They also raise questions about how evaluation criteria that are mainly subjective and do not include specific metrics may hinder the visibility of research output from a global perspective.
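The comparison described above — confronting an expert-based classification (QUALIS strata) with a citation-based indicator (SNIP) — can be sketched in a few lines of code. The snippet below uses entirely hypothetical journal records (the names, strata, and SNIP values are illustrative assumptions, not figures from the study's dataset) to show the two computations involved: the share of journals that carry a SNIP value at all (i.e. are Scopus-indexed), and the mean SNIP per QUALIS stratum.

```python
# Sketch of the study's comparison, on hypothetical data:
# QUALIS stratum (expert-based) vs. SNIP (citation-based).
from statistics import mean

# Each record: (journal, qualis_stratum, snip); snip is None when the
# journal is not Scopus-indexed. All values below are illustrative.
journals = [
    ("J1", "A1", 1.40),
    ("J2", "A1", 0.36),
    ("J3", "A2", 0.88),
    ("J4", "B1", 0.91),
    ("J5", "B1", None),   # not indexed in Scopus
    ("J6", "B2", None),   # not indexed in Scopus
]

# Share of journals that are Scopus-indexed (have a SNIP value).
indexed = [j for j in journals if j[2] is not None]
share_indexed = len(indexed) / len(journals)

# Mean SNIP per QUALIS stratum, over indexed journals only.
snip_by_stratum = {}
for _, stratum, snip in indexed:
    snip_by_stratum.setdefault(stratum, []).append(snip)
mean_snip = {s: mean(v) for s, v in snip_by_stratum.items()}

print(f"Scopus-indexed share: {share_indexed:.2%}")
print(f"Mean SNIP by stratum: {mean_snip}")
```

On data like this, overlapping SNIP values across strata (as in A1 vs. B1 here) would be the signature of the study's finding that QUALIS categories do not cleanly separate journals by citation impact.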
Notes
As researchers occasionally publish in journals in knowledge fields other than their own, the dataset also included journals that were not specifically targeted to the Education community.
Cite this article
Reategui, E., Pires, A., Carniato, M. et al. Evaluation of Brazilian research output in education: confronting international and national contexts. Scientometrics 125, 427–444 (2020). https://doi.org/10.1007/s11192-020-03617-z