
Effects of large-scale research funding programs: a Japanese case study

Published in Scientometrics.

Abstract

This study investigates the effects of large-scale research funding from the Japanese government on the research outcomes of university researchers. To evaluate these effects, we use the difference-in-differences (DID) estimator and measure research outcomes in terms of the number of papers and the citation counts per paper. Our analysis shows that the funding program led to an increase in the number of papers in some fields and an increase in citation counts in others. A comparison of our estimation results with assessment data obtained from peer reviews revealed important differences. Since the characteristics of research vary by field, bibliometric analysis should be used alongside peer review for a more accurate assessment of research impact.
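The DID logic the abstract describes can be sketched in a few lines: compare the pre-to-post change in an outcome for the funded (treatment) group against the change for the unfunded (control) group, so that a common time trend is differenced out. The numbers below are purely illustrative, not values from the study.

```python
# Minimal sketch of the difference-in-differences (DID) estimator.
# All figures are hypothetical mean outcomes (e.g., papers per researcher),
# not data from the paper.

def did_estimate(treat_pre: float, treat_post: float,
                 ctrl_pre: float, ctrl_post: float) -> float:
    """DID effect = (change in treatment group) - (change in control group)."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical means before/after the funding program starts.
effect = did_estimate(treat_pre=3.0, treat_post=4.5,
                      ctrl_pre=2.8, ctrl_post=3.3)
print(effect)  # 1.0: the shared trend of +0.5 is subtracted out
```

In practice the paper's estimation would be a regression with group, period, and interaction terms plus controls; this sketch only shows the two-group, two-period identity that underlies it.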


Notes

  1. Details are given on the official site of the 21st COE Program (http://www.jsps.go.jp/english/e-21coe/index.html) of the Japan Society for the Promotion of Science.

  2. This condition is weaker than the random assignment condition in Eq. 2.

  3. A drawback of the Scopus database is that it includes citation counts only from 1996 onward. However, since we used only data from 1997 onward, this drawback did not affect our analysis.

  4. Although we do not use papers and citations that are not included among the Scopus source documents, the ratio of papers not covered by Scopus is equal between the treatment and control groups within a field; such differences are offset by the DID estimation and therefore do not influence the estimation results.

  5. We used 1998 as the base year for the fields incorporated in the COE program of FY 2002, and 1999 for the fields incorporated in the COE program of FY 2003.

  6. In future work, we may perform a DID estimation using the number of Japanese books in these fields, drawn from another database, alongside English-language journal papers and citations.


Author information

Correspondence to Naomi Fukuzawa.

About this article

Cite this article

Ida, T., Fukuzawa, N. Effects of large-scale research funding programs: a Japanese case study. Scientometrics 94, 1253–1273 (2013). https://doi.org/10.1007/s11192-012-0841-3
