The ranking of scientific journals is important because of the signal it sends to scientists about what is considered most vital for scientific progress. Existing ranking systems focus on measuring the influence of a scientific paper (citations); these rankings do not reward journals for publishing innovative work that builds on new ideas. We propose an alternative ranking based on the proclivity of journals to publish papers that build on new ideas, and we implement this ranking via a text-based analysis of all biomedical papers published since 1946. A comparison of our neophilia ranking with citation-based (impact factor) rankings shows that the two approaches are distinct. Prior theoretical work suggests an active role for our neophilia index in science policy. Absent an explicit incentive to pursue novel science, scientists underinvest in innovative work because of a coordination problem: for work on a new idea to flourish, many scientists must decide to adopt it in their own work. Rankings based purely on influence thus do not provide sufficient incentives for publishing innovative work. By contrast, adoption of the neophilia index in the journal-ranking procedures of funding agencies and university administrators would give journals an explicit incentive to publish innovative work and would thus help solve the coordination problem by strengthening scientists' incentives to pursue it.
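To make the ranking concrete, here is a minimal sketch in Python of how a neophilia-style journal score might be computed from a corpus of (year, journal, terms) records. The five-year novelty window, the record format, and all names below are illustrative assumptions, not the paper's exact procedure, which rests on a text analysis of the full biomedical literature.

```python
from collections import defaultdict

# Illustrative assumption: a term counts as a "new idea" if it first
# appeared in the corpus at most this many years before the citing paper.
NEW_IDEA_WINDOW = 5

def first_use_years(papers):
    """Map each term to the earliest year it appears in the corpus."""
    first_use = {}
    for year, _journal, terms in sorted(papers, key=lambda p: p[0]):
        for term in terms:
            first_use.setdefault(term, year)
    return first_use

def neophilia_scores(papers):
    """Share of each journal's papers that mention at least one recent term."""
    first_use = first_use_years(papers)
    counts = defaultdict(lambda: [0, 0])  # journal -> [novel papers, total papers]
    for year, journal, terms in papers:
        builds_on_new = any(year - first_use[t] <= NEW_IDEA_WINDOW for t in terms)
        counts[journal][0] += int(builds_on_new)
        counts[journal][1] += 1
    return {j: novel / total for j, (novel, total) in counts.items()}

# Toy corpus of (year, journal, terms) records.
papers = [
    (1990, "Journal A", {"gene therapy"}),
    (1992, "Journal A", {"gene therapy", "pcr"}),
    (1999, "Journal B", {"gene therapy"}),  # the term is no longer new by 1999
]
print(sorted(neophilia_scores(papers).items(), key=lambda kv: -kv[1]))
```

Ranking journals by such a score rewards those whose papers disproportionately build on recently introduced ideas, in contrast to citation counts, which reward influence regardless of the vintage of the ideas a paper draws on.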
Keywords: Novel science · Novelty · Journal rankings · Citations · Impact factor · Text analysis
We thank Bruce Weinberg, Vetla Torvik, Neil Smalheiser, Partha Bhattacharyya, Walter Schaeffer, Katy Borner, Robert Kaestner, Donna Ginther, Joel Blit and Joseph De Juan for comments. We also thank seminar participants at the University of Illinois at Chicago Institute of Government and Public Affairs, at the Research in Progress Seminar at Stanford Medical School, and at the National Bureau of Economic Research working group on Invention in an Aging Society for helpful feedback. Finally, we thank the National Institute on Aging for funding this research through grant P01-AG039347. We are solely responsible for the content of, and any errors in, the paper.