The evaluation of the Austrian START programme: an impact analysis of a research funding programme using a multi-method approach

Article

Abstract

This article presents and discusses the approach and findings of a recently conducted evaluation of the Austrian START programme. The START programme is one of Austria's most prestigious research grants for individual researchers at the post-doctoral level and provides the grantee with up to 1.2 million Euro over a period of up to 5 years. The programme's aims are twofold: supporting excellent research and qualifying the grantee for a (permanent) senior research position in the research system. The article discusses the effects of the programme, focusing in particular on the impacts on the grantees as the main beneficiaries; it investigates their scientific output and career development. Furthermore, the analysis of the indirect beneficiary groups and of the system in which the START programme is embedded aims to answer whether and how the programme has contributed to strengthening the capabilities of the Austrian science system. The evaluation uses a control group approach to quantify the effects on the grantees. To counterbalance the weaknesses of traditional quantitative impact analysis and to gain a deeper understanding of the mechanisms through which the funding takes effect, this approach was complemented by further qualitative and quantitative evidence.
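To make the control-group logic concrete, the following minimal sketch (in Python, with entirely hypothetical data, group sizes and effect size) compares post-award publication counts of funded applicants with those of non-funded applicants using a simple difference in means. It is an illustration of the general technique only, not the study's actual design, which relies on matched applicant groups and bibliometric data.

```python
# Hypothetical sketch of a control-group comparison: grantees vs. a
# comparison group of non-funded applicants. All numbers are invented
# for illustration and are not taken from the START evaluation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Assumed post-award publication counts per researcher.
grantees = rng.poisson(lam=12, size=60)   # funded applicants (hypothetical)
controls = rng.poisson(lam=9, size=120)   # non-funded applicants (hypothetical)

# Difference in means with a two-sample Welch t-test.
diff = grantees.mean() - controls.mean()
t_stat, p_value = stats.ttest_ind(grantees, controls, equal_var=False)

print(f"Mean publications, grantees: {grantees.mean():.1f}")
print(f"Mean publications, controls: {controls.mean():.1f}")
print(f"Difference: {diff:.1f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```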

Keywords

Programme evaluation · Impact analysis · Science policy · Research funding · Mixed methods approach · Bibliometric analysis

JEL Classification

O3 · O38 · I2 · I23 · I28

Copyright information

© Springer Science+Business Media, LLC 2017

Authors and Affiliations

  1. Fraunhofer ISI, Karlsruhe, Germany
  2. Eva Heckl, KMU Forschung Austria, Vienna, Austria
