
Annals of Operations Research, Volume 223, Issue 1, pp 95–108

Influential DMUs and outlier detection in data envelopment analysis with an application to health care

  • Ali Reza Bahari
  • Ali Emrouznejad

Abstract

This paper explains some drawbacks of a previous approach to detecting influential observations in deterministic nonparametric data envelopment analysis (DEA) models, developed by Yang et al. (Annals of Operations Research 173:89–103, 2010). In particular, the efficiency scores and relative entropies obtained in that model are not informative for outlier detection, and the empirical distribution of the estimated relative entropies is not a Monte Carlo approximation. We develop a new method to detect whether a specific DMU is truly influential, and apply a statistical test to determine the significance level. An application measuring the efficiency of hospitals demonstrates the advantages of the proposed method for outlier detection.
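The bootstrap test proposed in the paper is not reproduced here, but the underlying idea of an influential DMU can be sketched with a simpler leave-one-out (jackknife-style) screen: remove each DMU from the reference set in turn and measure how much the other units' efficiency scores shift. The sketch below uses the standard input-oriented CCR envelopment LP; the function names `ccr_efficiency` and `loo_influence` and the toy data are illustrative assumptions, not the authors' procedure.

```python
# Leave-one-out influence screen for DEA -- an illustrative sketch only,
# NOT the paper's bootstrap test. Requires numpy and scipy.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o, ref):
    """Input-oriented CCR efficiency of DMU `o` against reference DMUs `ref`.
    Decision variables are [theta, lambda_1, ..., lambda_n]."""
    n, m, s = len(ref), X.shape[1], Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                    # minimise theta
    A_ub = np.vstack([
        np.c_[-X[o].reshape(-1, 1), X[ref].T],     # sum_j lam_j x_ij <= theta * x_io
        np.c_[np.zeros((s, 1)), -Y[ref].T],        # sum_j lam_j y_rj >= y_ro
    ])
    b_ub = np.r_[np.zeros(m), -Y[o]]
    bounds = [(0, None)] * (n + 1)                 # theta and all lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

def loo_influence(X, Y):
    """Average absolute shift in the other DMUs' efficiencies when DMU j is
    removed from the reference set; large values flag candidate influential DMUs."""
    n = X.shape[0]
    all_ref = list(range(n))
    full = np.array([ccr_efficiency(X, Y, o, all_ref) for o in all_ref])
    infl = np.zeros(n)
    for j in range(n):
        ref = [k for k in all_ref if k != j]
        drop = np.array([ccr_efficiency(X, Y, o, ref) for o in ref])
        infl[j] = np.mean(np.abs(drop - full[ref]))
    return full, infl

# Toy data: three DMUs, one input, one output; DMU 2 dominates the frontier,
# so removing it shifts the other scores the most.
X = np.array([[2.0], [4.0], [1.0]])
Y = np.array([[2.0], [2.0], [4.0]])
full, infl = loo_influence(X, Y)
```

A large value of `infl[j]` only flags DMU `j` as a candidate; the paper's contribution is the statistical test that decides whether such a shift is significant rather than sampling noise.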

Keywords

Data envelopment analysis (DEA); Bootstrapping; Outlier detection; Influential DMU; Hospital efficiency

References

  1. Andersen, P., & Petersen, N. C. (1993). A procedure for ranking efficient units in data envelopment analysis. Management Science, 39(10), 1261–1264.
  2. Banker, R. D., & Chang, H. (2006). The super-efficiency procedure for outlier identification, not for ranking efficient units. European Journal of Operational Research, 175(2), 1311–1320.
  3. Bilsel, M., & Davutyan, N. (2011). Hospital efficiency with risk adjusted mortality as undesirable output: The Turkish case. Annals of Operations Research, 1–16.
  4. Chang, S. J., Hsiao, H. C., Huang, L. H., & Chang, H. (2011). Taiwan quality indicator project and hospital productivity growth. Omega, 39(1), 14–22.
  5. Efron, B. (1979). Bootstrap methods: Another look at the jackknife. Annals of Statistics, 7, 1–16.
  6. Efron, B. (1982). The jackknife, the bootstrap and other resampling plans. CBMS-NSF Regional Conference Series in Applied Mathematics, 38. Philadelphia: Society for Industrial and Applied Mathematics.
  7. Efron, B., & Tibshirani, R. J. (1993). An introduction to the bootstrap. London: Chapman and Hall.
  8. Emrouznejad, A., Parker, B. R., & Tavares, G. (2008). Evaluation of research in efficiency and productivity: A survey and analysis of the first 30 years of scholarly literature in DEA. Socio-Economic Planning Sciences, 42(3), 151–157.
  9. Färe, R., Grosskopf, S., Lindgren, B., & Roos, P. (1994). Productivity developments in Swedish hospitals: A Malmquist output index approach (pp. 253–272). Boston: Kluwer.
  10. Field, K., & Emrouznejad, A. (2003). Measuring the performance of neonatal care units in Scotland. Journal of Medical Systems, 27(4), 315–324.
  11. Hatami-Marbini, A., Tavana, M., & Emrouznejad, A. (2012). Productivity growth and efficiency measurements in fuzzy environments with an application to health care. International Journal of Fuzzy System Applications, 2(2), 1–34.
  12. Hollingsworth, B. (2008). Non-parametric and parametric applications measuring efficiency in health care. Health Economics, 17, 1107–1128.
  13. Johnson, A. L., & McGinnis, L. F. (2008). Outlier detection in two-stage semiparametric DEA models. European Journal of Operational Research, 187(2), 629–635.
  14. Kirigia, J. M., Emrouznejad, A., & Sambo, L. G. (2002). Measurement of technical efficiency of public hospitals in Kenya: Using data envelopment analysis. Journal of Medical Systems, 26(1), 39–45.
  15. Kirigia, J. M., Emrouznejad, A., Sambo, L. G., Munguti, N., & Liambila, W. (2004). Using data envelopment analysis to measure the technical efficiency of public health centres in Kenya. Journal of Medical Systems, 28(2), 155–166.
  16. Kirigia, J. M., Emrouznejad, A., Vaz, R. G., Bastiene, H., & Padayachy, J. (2008). A comparative assessment of performance and productivity of health centers in Seychelles. International Journal of Productivity and Performance Management, 57(1), 72–92.
  17. Masiye, F., Kirigia, J. M., Emrouznejad, A., & Chimfwembe, D. (2003). Efficiency of health centres in Zambia: Using data envelopment analysis. Mimeo, WHO, Brazzaville, Congo.
  18. O’Neill, L., Rauner, M., Heidenberger, K., & Kraus, M. (2008). A cross-national comparison and taxonomy of DEA-based hospital efficiency studies. Socio-Economic Planning Sciences, 42(3), 158–189.
  19. Ouellette, P., & Vierstraete, V. (2004). Technological change and efficiency in the presence of quasi-fixed inputs: A DEA application to the hospital sector. European Journal of Operational Research, 154(3), 755–763.
  20. Pastor, J. T., Ruiz, J. L., & Sirvent, I. (1999). A statistical test for detecting influential observations in DEA. European Journal of Operational Research, 115, 542–554.
  21. Puig-Junoy, J. (2000). Partitioning input cost efficiency into its allocative and technical components: An empirical DEA application to hospitals. Socio-Economic Planning Sciences, 34(3), 199–218.
  22. Seaver, B. L., & Triantis, K. P. (1989). The implications of using messy data to estimate production-frontier-based technical efficiency measures. Journal of Business and Economic Statistics, 7, 49–59.
  23. Simar, L. (1992). Estimating efficiencies from frontier models with panel data: A comparison of parametric, non-parametric and semi-parametric methods with bootstrapping. Journal of Productivity Analysis, 3, 167–203.
  24. Simar, L. (1996). Aspects of statistical analysis in DEA-type frontier models. Journal of Productivity Analysis, 7, 177–185.
  25. Simar, L. (2003). Detecting outliers in frontier models: A simple approach. Journal of Productivity Analysis, 20, 391–424.
  26. Simar, L., & Wilson, P. W. (1998). Sensitivity analysis of efficiency scores: How to bootstrap in nonparametric frontier models. Management Science, 44(1), 49–61.
  27. Simar, L., & Wilson, P. W. (2000). A general methodology for bootstrapping in non-parametric frontier models. Journal of Applied Statistics, 27(6), 779–802.
  28. Simar, L., & Wilson, P. W. (2001). Testing restrictions in nonparametric frontier models. Communications in Statistics: Simulation and Computation, 30, 161–186.
  29. Simar, L., & Wilson, P. W. (2002). Nonparametric tests of returns to scale. European Journal of Operational Research, 139, 115–132.
  30. Swanson Kazley, A., & Ozcan, Y. A. (2009). Electronic medical record use and efficiency: A DEA and windows analysis of hospitals. Socio-Economic Planning Sciences, 43(3), 209–216.
  31. Wilson, P. W. (1993). Detecting outliers in deterministic nonparametric frontier models with multiple outputs. Journal of Business and Economic Statistics, 11, 319–323.
  32. Wilson, P. W. (1995). Detecting influential observations in data envelopment analysis. Journal of Productivity Analysis, 6, 27–45.
  33. Yang, Z., Wang, X., & Sun, D. (2010). Using the bootstrap method to detect influential DMUs in data envelopment analysis. Annals of Operations Research, 173, 89–103.

Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Iran University of Science & Technology (IUST), Tehran, Iran
  2. Operations and Information Management Group, Aston Business School, Aston University, Birmingham, UK
