
What difference does it make? Impact of peer-reviewed scholarships on scientific production

Abstract

We investigated the extent to which different selection mechanisms for awarding scholarships varied in their short- and longer-term consequences for the scientific production of awardees. We conducted an impact evaluation study of undergraduate, master’s, and PhD research scholarships, comparing two different funding sources in Brazil: in one, the selection mechanism was based on a peer review system; in the other, on an institutional system other than peer review. Over 8,500 questionnaires were successfully completed, covering the period 1995–2009. The two groups were compared in terms of their scientific performance using a propensity score approach. We found that peer-reviewed scholarship awardees performed better: they published more often, and in journals with higher impact factors, than awardees from the other group. However, two further results qualify this picture. First, although awardees under the peer review system continued over the long term to increase their publication rate and to publish in higher-quality journals, the differences from the control group tended to diminish after PhD graduation. Second, the better performance of peer-reviewed scholarship awardees was not observed in all subject areas. The main policy implications of this study concern a better understanding of selection mechanisms and of the heterogeneity in the relation between selection processes and scientific and academic output.
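The propensity score comparison mentioned above can be illustrated in a few lines. The sketch below is a minimal, self-contained example on synthetic data, assuming a single confounding covariate and a hand-rolled logistic model with 1:1 nearest-neighbour matching; the variable names, simulated effect size, and estimator are illustrative assumptions, not the paper's actual data or method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a covariate (e.g. prior publications) influences both
# the chance of receiving a peer-reviewed scholarship and later output.
n = 2000
prior_pubs = rng.normal(0.0, 1.0, n)
treated = (rng.random(n) < 1 / (1 + np.exp(-prior_pubs))).astype(int)
outcome = 0.5 * treated + 0.8 * prior_pubs + rng.normal(0.0, 1.0, n)

# Naive difference in means is confounded by prior_pubs.
naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()

# Estimate propensity scores with a small logistic regression
# fitted by gradient ascent (no external dependencies).
X = np.column_stack([np.ones(n), prior_pubs])
w = np.zeros(2)
for _ in range(500):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.1 * X.T @ (treated - p) / n
ps = 1 / (1 + np.exp(-X @ w))

# 1:1 nearest-neighbour matching on the propensity score (with replacement):
# each treated unit is paired with the control whose score is closest.
t_idx = np.where(treated == 1)[0]
c_idx = np.where(treated == 0)[0]
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]
att = (outcome[t_idx] - outcome[matches]).mean()

print(f"naive difference: {naive:.2f}")
print(f"matched estimate: {att:.2f} (simulated effect is 0.5)")
```

The matched estimate recovers something close to the simulated effect, while the naive comparison overstates it, which is the confound the propensity score approach is meant to address.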


[Figs. 1–6]

Notes

  1. In that study, the observed difference between the groups should be interpreted with care: the control group comprised both non-awardees who applied in the same year as the awardees and awardees in the same program who were selected at a later stage than the treatment group.

  2. FAPESP is the largest state research agency in Brazil. It accounts for almost 40% of the total financial support given annually to research activities in the state of São Paulo. São Paulo State accounts for one-third of Brazil’s GDP, one-fifth of the country’s population, and 45% of its annual PhD graduates.


Acknowledgments

We gratefully acknowledge support from FAPESP under Award Number 2008/58628-7. We would also like to thank Prof. Stan Metcalfe, Dr. Ronald Ramlogan, and Prof. Carlos Henrique Brito Cruz for their valuable comments. The findings and results are solely the responsibility of the authors.

Author information


Corresponding author

Correspondence to Adriana Bin.


About this article

Cite this article

Bin, A., Salles-Filho, S., Capanema, L.M. et al. What difference does it make? Impact of peer-reviewed scholarships on scientific production. Scientometrics 102, 1167–1188 (2015). https://doi.org/10.1007/s11192-014-1462-9


Keywords

  • Scholarships
  • Peer review
  • Scientific production
  • Impact
  • Evaluation
  • Propensity score

Mathematics Subject Classification

  • 62P25

JEL Classification

  • O380