
Minerva, Volume 53, Issue 2, pp 89–115

Research Portfolio Analysis in Science Policy: Moving from Financial Returns to Societal Benefits

  • Matthew L. Wallace
  • Ismael Rafols

Abstract

Funding agencies and large public scientific institutions are increasingly using the term “research portfolio” as a means of characterizing their research. While portfolios have long been used as a heuristic for managing corporate R&D (i.e. R&D aimed at gaining tangible economic benefits), they remain ill-defined in a science policy context, where research is aimed at achieving societal outcomes. In this article we analyze the discursive uses of the term “research portfolio” and propose some general considerations for its application in science policy. We explore the use of the term in private R&D and the related scholarly literature, in existing science policy practices, and in relevant strands of science policy scholarship. While the financial analogy can in some instances be instructive, a simple transposition from the world of finance or of corporate R&D to public research is problematic. Nevertheless, we identify potentially fruitful uses of portfolio analysis in science policy. In particular, our review suggests that the concept of research portfolio can be a useful analytical instrument for tackling complex societal challenges. Specifically, the strands of scholarship identified suggest that the use of research portfolios should: i) recognize the diversity of research lines relevant to a given societal challenge, given the uncertainty and ambiguity of research outcomes; ii) examine the relationships between the research options of a portfolio and the expected societal outcomes; and iii) adopt a systemic perspective on research portfolios – i.e. examine a portfolio as a functional whole rather than as the sum of its parts. We argue that, with these considerations, portfolio-driven approaches may foster social inclusion in science policy decisions, support deliberation among “alternative” portfolios for tackling complex societal challenges, and promote cost-effectiveness and transparency.
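To make the contrast at the heart of the abstract concrete, the sketch below sets a Markowitz-style mean-variance calculation (the financial heuristic the authors argue cannot be simply transposed to public research) alongside a Stirling-type diversity index over variety, balance, and disparity, which is closer to the abstract's point (i) about recognizing the diversity of research lines. This is an illustrative sketch only, not a method from the article: the four funding options, their “returns”, the covariance matrix, and the pairwise disparities are all invented numbers.

```python
# Illustrative sketch (not from the article): two readings of a "portfolio".
# (1) Financial logic (Markowitz): diversify to reduce variance of returns.
# (2) Diversity logic (Stirling-type): spread effort across *dissimilar*
#     research lines, regardless of any expected financial payoff.
# All inputs below are hypothetical numbers chosen for illustration.
import numpy as np

# Funding shares across four hypothetical research lines (sum to 1).
weights = np.array([0.4, 0.3, 0.2, 0.1])

# --- (1) Financial view: expected return and risk of the weighted mix ---
expected_returns = np.array([0.05, 0.08, 0.12, 0.20])  # hypothetical
cov = np.array([  # hypothetical covariance of returns (symmetric, PSD)
    [0.010, 0.002, 0.001, 0.000],
    [0.002, 0.020, 0.003, 0.001],
    [0.001, 0.003, 0.040, 0.002],
    [0.000, 0.001, 0.002, 0.090],
])
portfolio_return = weights @ expected_returns
portfolio_risk = float(np.sqrt(weights @ cov @ weights))  # std deviation

# --- (2) Diversity view: Stirling-type index D = sum_{i != j} d_ij p_i p_j ---
# d_ij is a disparity (e.g. cognitive distance) between lines i and j;
# p_i is the funding share of line i. D grows with variety, balance,
# and disparity, not with uncorrelated payoffs.
disparity = np.array([  # hypothetical pairwise distances in [0, 1]
    [0.0, 0.3, 0.7, 0.9],
    [0.3, 0.0, 0.5, 0.8],
    [0.7, 0.5, 0.0, 0.4],
    [0.9, 0.8, 0.4, 0.0],
])
n = len(weights)
D = sum(disparity[i, j] * weights[i] * weights[j]
        for i in range(n) for j in range(n) if i != j)

print(f"financial view:  expected return = {portfolio_return:.3f}, "
      f"risk (std dev) = {portfolio_risk:.3f}")
print(f"diversity view:  Stirling-type D = {D:.3f}")
```

The juxtaposition illustrates why the transposition is not simple: variance asks how correlated the payoffs of the options are, whereas the diversity index asks how different the options themselves are, which matters when outcomes are uncertain and ambiguous rather than priced.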

Keywords

Research portfolio · Prioritisation · Research landscape · Societal challenges


Acknowledgements

We thank Tommaso Ciarli, Jochen Gläser, Jordi Molas-Gallart, Richard Woolley, and two anonymous referees for their insightful comments and suggestions. We acknowledge support from the UK Economic and Social Research Council (Grant RES-360-25-0076, Mapping the Development of Emergent Technologies) and the FP7 EU Marie Curie Integration Grant to IR (MapRePort).


Copyright information

© Springer Science+Business Media Dordrecht 2015

Authors and Affiliations

  1. INGENIO (CSIC-UPV), Universitat Politècnica de València, Valencia, Spain
  2. SPRU, University of Sussex, Brighton, UK
  3. Observatoire des Sciences et Techniques (HCERES-OST), Paris, France
