Assessing the scientific and technological output of EU Framework Programmes: evidence from the FP6 projects in the ICT field
Abstract

This paper provides a quantitative assessment of the scientific and technological productivity of FP6 projects by exploiting a new database on articles and patents resulting from EU-funded projects. Starting with FP6, the design of European technology policy has undergone significant changes, with the introduction of new funding instruments aimed at achieving a “critical mass” of resources. Our empirical results support the concern, expressed by several observers, that the new funding instruments may have resulted in artificially “too large” research consortia. The available empirical evidence shows that scientific productivity increases with the number of participants following an inverted-U shape, indicating decreasing marginal returns to increases in the size of research consortia. A second key result of the paper concerns the existence of significant differences in performance across funding instruments. In particular, after accounting for the larger amount of resources allocated to them, Integrated Projects perform less well in terms of scientific output than both STRePs and Networks of Excellence, and they do not outperform STRePs in terms of patent applications.


Notes

  1. FP6 was allocated a budget of around €19 billion, corresponding to 4% of the total EU government-funded R&D budget, and ran from 2002 to the end of 2006. Around 65% of the FP6 budget was allocated to the “thematic priority areas”, within which the new funding instruments were deployed.

  2. STRePs involve smaller consortia and more narrowly focused research that is considered innovative within a predetermined workplan. They are self-contained and closest to the typical collaborative research traditionally supported by the FPs. Other funding instruments are the Integrated Infrastructure Initiatives (I3), Coordinated Actions (CAs) and Specific Support Actions (SSAs), which provide support or coordination to research efforts and areas of policy application pursued under the other instruments.

  3. Other notable trends have been the reduction in the number of funded projects from 12,391 in FP5 to 5,485 in FP6 (excluding Human Resource and Mobility Actions) and the decrease in the success rate of proposals from 26% in FP5 to 18% in FP6.

  4. Data were collected through MS Excel spreadsheets sent to the project co-ordinators at the end of each year in which the project was active. Each spreadsheet provides a template for entering information into the appropriate fields, e.g. title of the article, publication year and so on. Unfortunately, very few project co-ordinators complied with the template, and the raw data received from the Commission required extensive cleaning and standardization.

  5. The Espacenet database (http://www.espacenet.com/index.en.htm) provides free access to more than 60 million patent documents from all over the world.

  6. In this paper, we focus our attention on STRePs, IPs and NoEs. Other funding instruments, such as Coordinated Actions (CAs) and Specific Support Actions (SSAs), are less likely to generate published scientific output and/or patented inventions. Overall, DG INFSO funded a total of 1,230 projects in FP6, of which 669 were STRePs, 253 IPs and 62 NoEs. Our sample includes only those STRePs, IPs and NoEs whose co-ordinators replied to the survey sent by DG INFSO (i.e. 734 projects out of 984).

  7. Of these 273 patent applications, the vast majority (64%) were patent applications at the European Patent Office (EPO), followed by the WIPO-PCT (18%), the USPTO (11%) and the German Patent Office (4%).

  8. A possible way of approaching this problem is to use the information on funding acknowledgements and other types of acknowledgements. As long as articles reported by project co-ordinators are truly the output of a specific EU project, those two fields should contain a reference to it. Unfortunately, the Thomson-WoS database started indexing this information only recently. As a consequence, only 53 articles in the sample considered here report a non-missing value for those fields. In all such cases, however, acknowledgement of EU funding was reported.

  9. All estimates reported here have been obtained using Stata 10.

  10. The same regression analysis was also carried out including the total amount of funding (in logs). Given its high correlation with the number of participants, however, we ran separate regressions.

  11. For more details, see http://cordis.europa.eu/ist/activities/activities.htm. The seven areas identified are: Applied IST research addressing major societal and economic challenges; Communication, computing and software technologies; Components and microsystems; IST Future and emerging technologies; International co-operation; Knowledge and interface technologies; and Nanotechnologies and nanosciences.

  12. Even without a formal benchmarking test, the share of patents by universities and PROs and the fraction of co-patents appear markedly larger than in the total population of patents in the corresponding technological classes. Indirectly, this result indicates that the data reported by project co-ordinators are to some extent reliable and may reflect the true output of funded projects.

  13. According to the most recent report monitoring the implementation of the FP7 during the first 2 years of operation (2007–2008), the average number of participants in Collaborative Projects is equal to around 11 compared to 18 for NoEs (http://ec.europa.eu/research/index.cfm?pg=reports).
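The inverted-U relation between consortium size and scientific output reported in the paper (estimated with count-data models in Stata 10, per notes 9–10) can be sketched as follows. This is an illustrative numerical example, not the paper's actual estimates: the coefficients `b0`, `b1`, `b2` are hypothetical, chosen only to reproduce the qualitative inverted-U pattern in a log-link count model.

```python
import numpy as np

# Hypothetical log-link count-model coefficients (NOT the paper's estimates).
# In a negative binomial / Poisson regression with a quadratic size term,
#   E[articles | n] = exp(b0 + b1*n + b2*n^2),
# an inverted U requires b1 > 0 and b2 < 0.
b0, b1, b2 = 0.5, 0.30, -0.01

def expected_output(n):
    """Expected article count for a consortium of n participants."""
    return np.exp(b0 + b1 * n + b2 * n ** 2)

# The curve peaks where the derivative of the exponent is zero:
# b1 + 2*b2*n = 0  =>  n* = -b1 / (2*b2).
n_star = -b1 / (2 * b2)

sizes = np.arange(1, 31)
counts = expected_output(sizes)
print(f"turning point: {n_star:.0f} participants")
# Beyond n*, adding participants lowers expected output: the decreasing
# marginal returns to consortium size discussed in the paper.
```

With these illustrative values the turning point is 15 participants; in the paper the location of the peak is an empirical estimate, not a fixed number.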


Acknowledgments

The data used for this research come from the DG INFSO, European Commission (service contract 32-CE-0224901/01-88). This paper is based on the results of our research and analysis and does not necessarily reflect the views of the European Commission. Thanks are due to Gianluca Tarasconi, Marinella Tortora, Stefano Spagnoli and Gözde Köse who provided invaluable research assistance.

Author information

Correspondence to Stefano Breschi.


About this article

Cite this article

Breschi, S., Malerba, F. Assessing the scientific and technological output of EU Framework Programmes: evidence from the FP6 projects in the ICT field. Scientometrics 88, 239–257 (2011). https://doi.org/10.1007/s11192-011-0378-x
