Scientometrics, Volume 88, Issue 3, pp 979–1001

Measuring economic journals’ citation efficiency: a data envelopment analysis approach

Abstract

Using data envelopment analysis (DEA) and statistical inference, this paper evaluates the citation performance of 229 economic journals. The journals are categorized into four main categories (A–D) based on their efficiency levels. The results are then compared with the 27 “core economic journals” introduced by Diamond (Curr Contents 21(1):4–11, 1989). This comparison reveals that, after more than 20 years, Diamond’s list of “core economic journals” is still valid. Finally, for the first time, the paper combines data from four well-known databases (SSCI, Scopus, RePEc, EconLit) and two quality ranking reports (the Kiel Institute internal ranking and the ABS quality ranking report) in a DEA setting in order to derive the ranking of the 229 economic journals. The ten economic journals with the highest citation performance are Journal of Political Economy, Econometrica, Quarterly Journal of Economics, Journal of Financial Economics, Journal of Economic Literature, American Economic Review, Review of Economic Studies, Journal of Econometrics, Journal of Finance, and Brookings Papers on Economic Activity.
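The efficiency scores underlying a DEA-based journal categorization come from solving one linear program per decision-making unit (here, per journal). As a minimal illustration only (this is not the paper’s actual model or dataset), an input-oriented CCR efficiency score in the spirit of Charnes, Cooper, and Rhodes (reference 11) can be computed with an off-the-shelf LP solver; the inputs and outputs below are hypothetical toy data:

```python
# Minimal input-oriented CCR DEA sketch (hypothetical toy data, not the
# paper's journal dataset). For each DMU o we solve:
#   min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """X: (m_inputs, n_dmus), Y: (s_outputs, n_dmus). Returns theta per DMU."""
    m, n = X.shape
    s = Y.shape[0]
    scores = []
    for o in range(n):
        # Decision variables: [theta, lam_1 .. lam_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Input constraints: X @ lam - theta * x_o <= 0
        A_in = np.hstack([-X[:, [o]], X])
        b_in = np.zeros(m)
        # Output constraints: -Y @ lam <= -y_o  (i.e. Y @ lam >= y_o)
        A_out = np.hstack([np.zeros((s, 1)), -Y])
        b_out = -Y[:, o]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.x[0])
    return np.array(scores)

# Toy example: 1 input (e.g. articles published), 1 output (e.g. citations).
X = np.array([[2.0, 4.0, 3.0]])
Y = np.array([[4.0, 4.0, 6.0]])
print(np.round(dea_ccr_input(X, Y), 3))
```

Efficient journals score 1; inefficient ones score below 1, and the scores could then be banded into categories such as A–D. A variable-returns-to-scale (BCC) variant, as in Banker, Charnes, and Cooper (reference 2), would add the convexity constraint that the lambdas sum to one.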

Keywords

Ranking journals, Economic journals, Data envelopment analysis, Indexing techniques

Mathematics Subject Classification (2000)

46N10, 62F07, 62G09

Notes

Acknowledgments

We would like to thank Professor Tibor Braun and the anonymous reviewers for their comments and suggestions on an earlier version of our paper. We would also like to thank Panayiotis Tzeremes for his assistance in collecting the journals’ information. Any remaining errors are solely the authors’ responsibility.

References

  1. Bakkalbasi, N., Bauer, K., Glover, J., & Wang, L. (2006). Three options for citation tracking: Google Scholar, Scopus and Web of Science. Biomedical Digital Libraries, 3(7), 1–8.
  2. Banker, R. D., Charnes, A., & Cooper, W. W. (1984). Some models for estimating technical and scale inefficiencies in data envelopment analysis. Management Science, 30(9), 1078–1092.
  3. Bar-Ilan, J. (2010). Citations to the “Introduction to informetrics” indexed by WoS, Scopus and Google Scholar. Scientometrics, 82(3), 495–506.
  4. Bauer, K., & Bakkalbasi, N. (2005). An examination of citation counts in a new scholarly communication environment. D-Lib Magazine, 11(9). http://www.dlib.org/dlib/september05/bauer/09bauer.html.
  5. Boles, J. N. (1967). Efficiency squared—Efficient computation of efficiency indexes. In Western Farm Economic Association Proceedings 1966 (pp. 137–142).
  6. Boles, J. N. (1971). The 1130 Farrell efficiency system—Multiple products, multiple factors. Berkeley: Giannini Foundation of Agricultural Economics, University of California.
  7. Bollen, J., & Van de Sompel, H. (2008). Usage Impact Factor: The effects of sample characteristics on usage-based impact metrics. Journal of the American Society for Information Science and Technology, 59(1), 136–149.
  8. Bonaccorsi, A., & Daraio, C. (2008). The differentiation of the strategic profile of higher education institutions. New positioning indicators based on microdata. Scientometrics, 74(1), 15–37.
  9. Bonaccorsi, A., Daraio, C., & Simar, L. (2006). Advanced indicators of productivity of universities. An application of robust nonparametric methods to Italian data. Scientometrics, 66(2), 389–410.
  10. Burton, M. P., & Phimister, E. (1995). Core journals: A reappraisal of the Diamond list. Economic Journal, 105(429), 361–373.
  11. Charnes, A., Cooper, W. W., & Rhodes, E. L. (1978). Measuring the efficiency of decision making units. European Journal of Operational Research, 2(6), 429–444.
  12. Coelli, T. J., & Perelman, S. (1999). A comparison of parametric and non-parametric distance functions: With applications to European railways. European Journal of Operational Research, 117(2), 326–339.
  13. Coelli, T. J., Rao, D. S. P., O’Donnell, C. J., & Battese, G. E. (2005). An introduction to efficiency and productivity analysis (2nd ed.). New York: Springer Science.
  14. Cook, W. D., Golany, B., Penn, M., & Raviv, T. (2007). Creating a consensus ranking of proposals from reviewers’ partial ordinal rankings. Computers & Operations Research, 34(4), 954–965.
  15. Cook, W. D., Raviv, T., & Richardson, A. J. (2010). Aggregating incomplete lists of journal rankings: An application to academic accounting journals. Accounting Perspectives, 9(3), 217–235.
  16. Debreu, G. (1951). The coefficient of resource utilization. Econometrica, 19(3), 273–292.
  17. Diamond, A. M. (1989). The core journals of economics. Current Contents, 21(1), 4–11.
  18. Efron, B. (1979). Bootstrap methods: Another look at the jackknife. Annals of Statistics, 7(1), 1–16.
  19. Etxebarria, G., & Gomez-Uranga, M. (2010). Use of Scopus and Google Scholar to measure social sciences production in four major Spanish universities. Scientometrics, 82(2), 333–349.
  20. Farrell, M. (1957). The measurement of productive efficiency. Journal of the Royal Statistical Society Series A, 120(3), 253–281.
  21. Førsund, F. R., Kittelsen, S. A. C., & Krivonozhko, V. E. (2009). Farrell revisited—Visualizing properties of DEA production frontiers. Journal of the Operational Research Society, 60(11), 1535–1545.
  22. Førsund, F. R., & Sarafoglou, N. (2002). On the origins of data envelopment analysis. Journal of Productivity Analysis, 17(1/2), 23–40.
  23. Franceschet, M. (2010). A comparison of bibliometric indicators for computer science scholars and journals on Web of Science and Google Scholar. Scientometrics, 83(1), 243–258.
  24. Garfield, E. (1955). Citation indexes to science: A new dimension in documentation through association of ideas. Science, 122(3159), 108–111.
  25. Garfield, E. (1979). Citation indexing: Its theory and applications in science, technology and humanities. New York: Wiley Interscience.
  26. Garfield, E. (2005). The agony and the ecstasy—The history and meaning of the journal Impact Factor. In International congress on peer review and biomedical publication, Chicago.
  27. Glanzel, W., & Moed, H. F. (2002). Journal impact measures in bibliometric research. Scientometrics, 53(2), 171–193.
  28. Halkos, G., & Tzeremes, N. (2007). International competitiveness in the ICT industry: Evaluating the performance of the top 50 companies. Global Economic Review, 36(2), 167–168.
  29. Halkos, G., & Tzeremes, N. (2010). The effect of foreign ownership on SMEs performance: An efficiency analysis perspective. Journal of Productivity Analysis, 34(2), 167–180.
  30. Harvey, C., Kelly, A., Morris, H., & Rowlinson, M. (2010). Academic journal quality guide, Version 4. The Association of Business Schools.
  31. Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences, 102, 16569–16572.
  32. Hoffman, A. J. (1957). Discussion on Mr. Farrell’s Paper. Journal of the Royal Statistical Society Series A, 120(III), 284.
  33. Kalaitzidakis, P., Mamuneas, T. P., & Stengos, T. (2003). Rankings of academic journals and institutions in economics. Journal of the European Economic Association, 1(6), 1346–1366.
  34. Kalaitzidakis, P., Mamuneas, T. P., & Stengos, T. (2010). An updated ranking of academic journals in economics, WP 10-15. The Rimini Centre for Economic Analysis.
  35. Kiel (2010). Criteria for research publications. Kiel Institute for World Economy. http://www.ifw-kiel.de/academy/criteria-for-research-publications.
  36. Klavans, R., & Boyack, K. (2009). Toward a consensus map of science. Journal of the American Society for Information Science and Technology, 60(3), 455–476.
  37. Koczy, L. A., & Strobel, M. (2007). The ranking of economics journals by a tournament method. Mimeo.
  38. Kodrzycki, Y. K., & Yu, P. (2006). New approaches to ranking economic journals. Contributions to Economic Analysis and Policy, 5(1), Art. 24.
  39. Koopmans, T. C. (1951). An analysis of production as an efficient combination of activities. In T. C. Koopmans (Ed.), Activity analysis of production and allocation (pp. 33–97). New York: Wiley.
  40. Kousha, K., & Thelwall, M. (2008). Sources of Google Scholar citations outside the Science Citation Index: A comparison between four science disciplines. Scientometrics, 74(2), 273–294.
  41. Laband, D. N., & Piette, M. J. (1994). The relative impacts of economics journals: 1970–1990. Journal of Economic Literature, 32(2), 640–666.
  42. Leydesdorff, L., de Moya-Anegon, F., & Guerrero-Bote, V. P. (2010). Journal maps on the basis of Scopus data: A comparison with Journal Citation Reports of the ISI. Journal of the American Society for Information Science and Technology, 61(2), 352–369.
  43. Liebowitz, S. J., & Palmer, J. C. (1984). Assessing the relative impacts of economics journals. Journal of Economic Literature, 22, 77–88.
  44. Liner, G. H., & Amin, M. (2004). Methods of ranking economic journals. Atlantic Economic Journal, 32(2), 140–149.
  45. Lopez-Illescas, C., de Moya-Anegon, F., & Moed, H. F. (2008). Coverage and citation impact of oncological journals in the Web of Science and Scopus. Journal of Informetrics, 2, 304–316.
  46. Lovell, C. A. L., & Schmidt, P. (1988). A comparison of alternative approaches to the measurement of productive efficiency. In A. Dogramaci & R. Färe (Eds.), Applications of modern production theory: Efficiency and productivity. Boston: Kluwer.
  47. Meho, L. I., & Yang, K. (2007). Impact of data sources on citation counts and ranking of LIS faculty: Web of Science versus Scopus and Google Scholar. Journal of the American Society for Information Science and Technology, 58(13), 2105–2125.
  48. Moed, H. F. (2010). Measuring contextual citation impact of scientific journals. Journal of Informetrics. doi: 10.1016/j.joi.2010.01.002.
  49. Norris, M., & Oppenheim, C. (2007). Comparing alternatives to the Web of Science for coverage of the social sciences’ literature. Journal of Informetrics, 1(2), 161–169.
  50. Noruzi, A. (2005). Google Scholar: The new generation of citation indexes. Libri, 55(4), 170–180.
  51. Palacios-Huerta, I., & Volij, O. (2004). The measurement of intellectual influence. Econometrica, 72(3), 963–977.
  52. Pinski, G., & Narin, F. (1976). Citation influence for journal aggregates of scientific publications: Theory, with application to the literature of Physics. Information Processing & Management, 12(5), 297–312.
  53. Pudovkin, A. I., & Garfield, E. (2004). Rank-normalized Impact Factor: A way to compare journal performance across subject categories. In Proceedings of the 67th ASIS&T annual meeting, 17, November, 2004. http://www.garfield.library.upenn.edu/papers/asistranknormalization2004.pdf.
  54. Pujol, F. (2008). Ranking journals following a matching model approach: An application to public economic journals. Journal of Public Economic Theory, 10(1), 55–76.
  55. Rainer, K. R., & Miller, M. D. (2005). Examining differences across journal rankings. Communications of the ACM, 48(2), 91–94.
  56. Ritzberger, K. (2008). A ranking of journals in economics and related fields. German Economic Review, 9(4), 402–430.
  57. Schneider, F., & Ursprung, H. W. (2008). The 2008 GEA journal-ranking for the economics profession. German Economic Review, 9(4), 532–538.
  58. Shephard, R. W. (1970). Theory of cost and production functions. Princeton: Princeton University Press.
  59. Simar, L., & Wilson, P. W. (1998). Sensitivity analysis of efficiency scores: How to bootstrap in nonparametric frontier models. Management Science, 44(1), 49–61.
  60. Simar, L., & Wilson, P. W. (2000). A general methodology for bootstrapping in non-parametric frontier models. Journal of Applied Statistics, 27(6), 779–802.
  61. Simar, L., & Wilson, P. W. (2002). Non-parametric tests of returns to scale. European Journal of Operational Research, 139(1), 115–132.
  62. Simar, L., & Wilson, P. (2008). Statistical inference in nonparametric frontier models: Recent developments and perspectives. In H. Fried, C. A. K. Lovell, & S. Schmidt (Eds.), The measurement of productive efficiency and productivity change (pp. 421–521). New York: Oxford University Press.
  63. Theussl, S., & Hornik, K. (2009). Journal ratings and their consensus ranking. In Operations research proceedings 2008. Berlin: Springer-Verlag. doi: 10.1007/978-3-642-00142-0_65.
  64. Zitt, M., & Small, H. (2008). Modifying the journal Impact Factor by fractional citation weighting: The Audience Factor. Journal of the American Society for Information Science and Technology, 59(11), 1856–1860.

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2011

Authors and Affiliations

  1. Department of Economics, University of Thessaly, Volos, Greece