The evaluation of the Austrian START programme: an impact analysis of a research funding programme using a multi-method approach
The following article presents and discusses the approach and findings of a recently conducted evaluation study of the Austrian START programme. The START programme is one of Austria’s most prestigious research grants for individual researchers at the post-doctoral level, providing the grantee with up to 1.2 million euros over up to 5 years. The programme’s aims are twofold: to support excellent research and to qualify the grantee for a (permanent) senior research position in the research system. The article discusses the effects of the programme, focusing especially on the impacts on the grantees as the main beneficiaries; in particular, the grantees’ scientific output and career development are investigated. Furthermore, the analyses of the indirect beneficiary groups and of the system in which the START programme is embedded aim to answer the question of whether and how the START programme has contributed to strengthening the capabilities of the Austrian science system. The evaluation uses a control-group approach to quantify the effects on the grantees. To counterbalance the weaknesses of traditional quantitative impact analysis and to obtain a deeper understanding of the mechanisms through which the funding takes effect, the evaluation was complemented by further evidence of a qualitative and quantitative nature.
Keywords: Programme evaluation · Impact analysis · Science policy · Research funding · Mixed methods approach · Bibliometric analysis
JEL Classification: O3 · O38 · I2 · I23 · I28