Abstract
This article presents and discusses the approach and findings of a recently conducted evaluation study of the Austrian START programme. The START programme is one of Austria’s most prestigious research grants for individual researchers at the post-doctoral level and provides the grantee with up to 1.2 million Euro for up to 5 years. The programme’s aims are twofold: supporting excellent research and qualifying the grantee for a (permanent) senior research position in the research system. The article discusses the effects of the programme and focuses especially on the impacts on the grantees as the main beneficiaries. In particular, the scientific output of the grantees and their career development are investigated. Furthermore, the analysis of the indirect beneficiary groups and of the system in which the START programme is embedded aims at answering the question of whether and how the START programme has contributed to strengthening the capabilities of the Austrian science system. The evaluation uses a control group approach to quantify the effects on the grantees. In order to counterbalance the weaknesses of traditional quantitative impact analysis and to obtain a deeper understanding of the mechanisms through which the funding takes effect, the evaluation was complemented by further evidence of both a qualitative and a quantitative nature.
Notes
As Roessner (2000) already showed, the terms inputs, outputs, outcomes, and impacts are not always used distinctly and without overlap.
The concept of "brain drain" refers to scientists who migrate from one country to another with no intention of returning (Grubel 1994)—an action that has international, economic and political impacts, especially in developing countries (see e.g. Lowell 2002). The brain drain approach argues that countries lose human capital if scientists go overseas to study or work, as they might decide to remain there. Thus, the emigration of highly skilled scientists results in a human capital loss ("brain drain") for the former home country and human capital earning ("brain gain") for the respective host country.
Fraunhofer ISI has implemented an Oracle-SQL version of this database and has systematically added further data and information to it. Among the extensions are the regionalisation (NUTS1, NUTS2 and NUTS3) of the EU-27 Member States and the attribution of the researcher’s sex based on the first name.
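The note above only mentions that the researcher’s sex is attributed via the first name. A minimal sketch of such a name-based attribution is given below; the lookup table and the helper `infer_sex` are hypothetical illustrations and not the Fraunhofer ISI implementation, which relies on a much larger reference list within the Oracle-SQL database.

```python
# Hypothetical first-name reference table; in practice a far larger list
# (e.g. derived from name registers) would be used.
FIRST_NAME_SEX = {
    "anna": "female",
    "maria": "female",
    "thomas": "male",
    "stefan": "male",
}

def infer_sex(full_name: str) -> str:
    """Return 'female', 'male' or 'unknown' based on the researcher's first name."""
    first_name = full_name.strip().split()[0].lower()
    return FIRST_NAME_SEX.get(first_name, "unknown")

if __name__ == "__main__":
    for author in ["Anna Mayer", "Thomas Huber", "Kim Lee"]:
        print(author, "->", infer_sex(author))
```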
It was not possible to conduct a meaningful bibliometric analysis of the comparison group, as the group was too small (n = 49) and the time spans too restricted to allow a before, during and after analysis for this group.
As the selection of the control group also controlled for the discipline, the analysis used the citation rate of each researcher rather than the field-weighted one. The reported citation rate is an average of all individual citation rates within the START group and within the control group, respectively. It therefore gives only an indication of changes, but not of productivity in individual disciplines.
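To illustrate the group-level comparison described in this note, the sketch below averages per-researcher citation rates (citations per publication, not field-weighted) within each group. The variable names and numbers are hypothetical illustrations, not the study’s data.

```python
from statistics import mean

# Citations per publication for each researcher, per group (hypothetical values).
start_group_rates = [4.2, 7.1, 3.5, 9.8]
control_group_rates = [3.9, 5.0, 2.8, 6.4]

def group_citation_rate(rates):
    """Average of the individual researchers' citation rates within one group."""
    return mean(rates)

print("START group:  ", round(group_citation_rate(start_group_rates), 2))
print("Control group:", round(group_citation_rate(control_group_rates), 2))
```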
* The differences in the number of grantees and control group researchers included in the sample are due to the fact that it was not possible to generate a twin for each START grantee or that data were missing. The drop in the number of persons between the three periods of analysis is due to the longitudinal design of the analysis and the fact that not all grantees have finished the funding period yet.
The Mann–Whitney U test was used to assess the similarity between the two groups.
Fisher’s test, significance level p < 0.05.
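For readers unfamiliar with the two tests named in the notes above, the sketch below shows how they can be computed with SciPy. The data are hypothetical illustrations, and it is assumed that "Fisher’s test" refers to Fisher’s exact test; the study’s actual variables and software are not specified here.

```python
from scipy.stats import mannwhitneyu, fisher_exact

# Mann-Whitney U test: compare a hypothetical metric (e.g. publication counts)
# between the START group and the control group.
start_values = [12, 18, 9, 22, 15, 11]
control_values = [10, 8, 14, 7, 13, 9]
u_stat, p_value = mannwhitneyu(start_values, control_values, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat}, p = {p_value:.3f}")

# Fisher's exact test on a hypothetical 2x2 table, e.g. permanent position
# obtained (yes/no) for START grantees vs. the control group.
table = [[30, 10],   # START: yes / no
         [18, 20]]   # control group: yes / no
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher's exact test: odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
print("significant at p < 0.05" if p_value < 0.05 else "not significant at p < 0.05")
```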
For this study, three different types of professorship have been distinguished, based on the former and current Austrian research system: full professorship (Universitäts- or FH-ProfessorIn), associate professorship (Assoziierte(r) ProfessorIn, formerly DozentIn) and assistant professorship (AssistenzprofessorIn).
However, this result has to be interpreted with caution, as it is based on a relatively small number of survey respondents for all three groups: n(start) = 64; n(control group) = 41; n(comparison group) = 17.
The Pearson correlation coefficient is −0.705 for the START grantees and −0.556 for the control group; no data are available for the comparison group, as data for this group exist only from 2007 onward.
The Wittgenstein Award provides recognition and support to excellent scientists who have already produced exceptional scientific work, occupy a prominent place in the international community and hold a permanent position as professor in one of Austria’s research organisations. It is the most generously funded research award in Austria.
This obligation was introduced because both programmes share many common features. To avoid infringing the principle of no double funding at the EU and national levels, the requirement to apply to both programmes was chosen as a way forward. If both applications are successful, the START funding is not granted. However, the successful candidate can keep the title of “START grantee” and receives supplementary funding for approximately 1 year that tops up the slightly less generously funded ERC grant.
References
Archambault, É., Vignola-Gagné, E., Côté, G., Larivière, V., & Gingras, Y. (2006). Benchmarking scientific output in the social sciences and humanities: The limits of existing databases. Scientometrics, 68(3), 329–342.
Befani, B., & Mayne, J. (2014). Process tracing and contribution analysis: A combined approach to generative causal inference for impact evaluation. IDS Bulletin, 45(6), 17–36.
Bogner, K., & Landrock, U. (2015). Antworttendenzen in standardisierten Umfragen. Mannheim, GESIS—Leibniz Institut für Sozialwissenschaften (SDM Survey Guidelines). doi:10.15465/sdm-sg_016.
Böhmer, S., & Hornbostel, S. (2009). Postdocs in Deutschland: Nachwuchsgruppenleiterprogramme im Vergleich. Berlin: iFQ-Working Paper 6. http://www.forschungsinfo.de/publikationen/workingPaper.php#2009. Accessed 10 Aug 2016.
Böhmer, S., Hornbostel, S., & Meuser, M. (2008). Postdocs in Deutschland: Evaluation des Emmy Noether-Programms. Berlin: iFQ-Working Paper, 3. http://www.forschungsinfo.de/publikationen/download/working_paper_3_2008.pdf. Accessed 16 Aug 2016.
Canibano, C., & Bozeman, B. (2009). Curriculum vitae method in science policy and research evaluation: The state-of-the-art. Research Evaluation, 18(2), 86–94. http://rev.oxfordjournals.org/content/18/2/86.full.pdf#page=1&view=FitH. Accessed 20 Sept 2016.
Chi, P.-S. (2013). Do non-source items make a difference in the social sciences? In Proceedings of ISSI 2013—the 14th international conference of the international society of scientometrics and informetrics, Vienna, Austria, 7/15/2013.
Conchi, S., & Michels, C. (2014). Scientific mobility—An analysis of Germany, Austria, France and Great Britain. Karlsruhe. Fraunhofer ISI Discussion Papers Innovation Systems and Policy Analysis (41), http://www.isi.fraunhofer.de/isi-de/p/publikationen/diskpap_innosysteme_policyanalyse.php. Accessed 10 Aug 2016.
European Commission (Ed.). (2014a). Marie Curie researchers and their long-term career development: A comparative study. Written by Economisti Associati. Brussels. https://ec.europa.eu/research/evaluations/pdf/archive/other_reports_studies_and_documents/marie_curie_researchers_and_their_long-term_career_development_-_a_comparative_study.pdf#view=fit&pagemode=none. Accessed 26 Aug 2016.
European Commission (Ed.). (2014b). Study on assessing the contribution of the framework programmes to the development of human research capacity. Prepared by IDEA Consult; iFG; PPMI. https://ec.europa.eu/research/evaluations/pdf/archive/other_reports_studies_and_documents/fp_hrc_study_final_report.pdf#view=fit&pagemode=none. Accessed 26 Aug 2016.
FWF (Ed.). (2007). A contest between nations; or how far is Austrian research behind that of the world leaders? An analysis of the competitiveness of Austria’s scientific research in the natural and social sciences.
Gerritsen, S., Plug, E., & van der Wiel, K. (2013). Up or out? How individual research grants affect academic careers in the Netherlands. CPB Discussion Paper, 249. The Hague. https://www.cpb.nl/sites/default/files/publicaties/download/cpb-discussion-paper-249-or-out-how-individual-research-grants-affect-academic-careers-netherlands.pdf. Accessed 20 Sept 2016.
Grubel, H. G. (1994). Brain drain, economics of. In T. Husen & T. Neville Postlethwaite (Eds.), The international encyclopedia of education I (pp. 554–561).
Guthrie, S., Wamae, W., Diepeveen, S., Wooding, S., & Grant, J. (2013). Measuring research. A guide to research evaluation frameworks and tools. San Francisco: RAND Corporation.
Hornbostel, S., Böhmer, S., Klingsporn, B., Neufeld, J., & von Ins, M. (2009). Funding of young scientist and scientific excellence. Scientometrics, 79(1), 171–190. doi:10.1007/s11192-009-0411-5.
Huber, N., Wegner, A., & Neufeld, J. (2015). MERCI (Monitoring European Research Council’s implementation of excellence): Evaluation report on the impact of the ERC Starting grant programme. Berlin: iFQ-Working Paper 16. http://www.forschungsinfo.de/Publikationen/Download/working_paper_16_2015.pdf. Accessed 20 Sept 2016.
Joly, P.-B., Gaunand, A., Colinet, L., Larédo, P., Lemarié, S., & Matt, M. (2015). ASIRPA: A comprehensive theory-based approach to assessing the societal impacts of a research organization. Research Evaluation, 24(4), 440–453. doi:10.1093/reseval/rvv015.
Krumpal, I. (2013). Determinants of social desirability bias in sensitive surveys: A literature review. Quality and Quantity: International Journal of Methodology, 47(4), 2025–2047.
Leeuw, F., & Vaessen, J. (2009). Impact evaluations and development: NONIE guidance on impact evaluation. Network of Networks on Impact Evaluation (NONIE).
Lowell, B. L. (2002). Skilled labour migration from developing countries: Annotated bibliography. International Migration Papers, 56, International Labour Office, Geneva. http://www.ilo.org/public/english/protection/migrant/download/imp/imp56e.pdf. Accessed 20 Sept 2016.
Mayne, J. (2001). Addressing attribution through contribution analysis: Using performance measures sensibly. Canadian Journal of Program Evaluation, 16(1), 1–24.
Mayne, J. (2012). Contribution analysis: Coming of age? Evaluation, 18(3), 270–280. doi:10.1177/1356389012451663.
Melin, G., & Danell, R. (2006). The top eight percent: Development of approved and rejected applicants for a prestigious grant in Sweden. Science and Public Policy, 33(10), 702–712.
Merton, R. K. (1973). The sociology of science. Chicago: The University of Chicago Press.
Meyer, N., & Bührer, S. (2014). Impact evaluation of the Erwin Schrödinger fellowships with return phase. Final Report for the Austrian Science Fund (FWF). Karlsruhe. https://ec.europa.eu/research/evaluations/pdf/archive/fp7-evidence-base/national_impact_studies/impact_evaluation_of_the_erwin_schroedinger_fellowships_with_return_phase.pdf. Accessed 26 Aug 2016.
Morton, S. (2015). Progressing research impact assessment: A ‘contributions’ approach. Research Evaluation, 24(4), 405–419. doi:10.1093/reseval/rvv016.
Nedeva, M., Braun, D., Edler, J., Frischer, D., Glanz, M., Gläser, J., et al. (2012). Understanding and assessing the impact and outcomes of the ERC and its funding schemes (EURECIA). Final Synthesis Report.
Norris, M., & Oppenheim, C. (2007). Comparing alternatives to the Web of Science for coverage of the social sciences’ literature. Journal of Informetrics, 1(2), 161–169.
Penfield, T., Baker, M. J., Scoble, R., & Wykes, M. C. (2014). Assessment, evaluations, and definitions of research impact. A review. Research Evaluation, 23(1), 21–32. doi:10.1093/reseval/rvt021.
Preisendörfer, P., & Wolter, F. (2014). Who is telling the truth? A validation study on determinants of response behaviour in surveys. Public Opinion Quarterly, 78(1), 126–146.
PREST (Ed.) (2002). Assessing the socio-economic impacts of the framework programme. Manchester. https://ec.europa.eu/research/evaluations/pdf/archive/other_reports_studies_and_documents/assessing_the_socio_economic_impacts_of_the_framework_programme_2002.pdf. Accessed 16 Aug 2016.
Reckling, F. J., & Fischer, C. (2010). Factors influencing approval probability in FWF decision-making procedures. FWF Stand-Alone Projects Programme, 1999 to 2008. FWF Discussion Paper.
Roessner, D. (2000). Quantitative and qualitative methods in the evaluation of research. Research Evaluation, 8(2), 125–132.
Seus, S., Heckl, E., & Bührer, S. (2016). Evaluation of the START Programme and the Wittgenstein Award. doi:10.5281/zenodo.50610. https://zenodo.org/record/50610/files/Eval-START-Witt_final_report.pdf. Accessed 13 Feb 2017.
Spaapen, J., & van Drooge, L. (2011). Introducing ‘productive interactions’ in social impact assessment. Research Evaluation, 20(3), 211–218. doi:10.3152/095820211X12941371876742.
van Arensbergen, P. (2014). Talent proof: Selection processes in research funding and careers. Dissertation. Den Haag: Rathenau Instituut.
Cite this article
Seus, S., Bührer, S. The evaluation of the Austrian START programme: an impact analysis of a research funding programme using a multi-method approach. J Technol Transf 47, 673–698 (2022). https://doi.org/10.1007/s10961-017-9606-8
Keywords
- Programme evaluation
- Impact analysis
- Science policy
- Research funding
- Mixed methods approach
- Bibliometric analysis