Annals of Operations Research, Volume 238, Issue 1–2, pp 475–496

A spatiotemporal Data Envelopment Analysis (S-T DEA) approach: the need to assess evolving units

  • Konstantinos Petridis
  • Alexander Chatzigeorgiou
  • Emmanouil Stiakakis
Article

Abstract

One of the major challenges in measuring efficiency in terms of resources and outcomes is assessing the evolution of units over time. Although Data Envelopment Analysis (DEA) has been applied to time series datasets, DEA models, by construction, form the reference set of an inefficient unit (its lambda values) based on distance from the efficient frontier, that is, in a purely spatial manner. When dealing with temporal datasets, however, the proximity in time between units should also be taken into account, since it reflects the structural resemblance among the time periods of an evolving unit. In this paper, we propose a two-stage spatiotemporal DEA (S-T DEA) approach that captures both the spatial and the temporal dimension through a multiobjective programming model. In the first stage, DEA is solved iteratively, allowing each unit to draw only preceding DMUs as peers into its reference set. In the second stage, the lambda values derived from the first stage are fed to a multiobjective Mixed Integer Linear Programming model, which filters the peers in the reference set according to weights assigned to the spatial and temporal dimensions. The approach is demonstrated on a real-world example drawn from software development.
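To make the first-stage idea concrete, the sketch below solves, for each period of an evolving unit, an envelopment DEA model whose candidate peer set is restricted to the current and all preceding periods. The input-oriented CCR (constant returns to scale) formulation and the use of scipy's `linprog` are assumptions for illustration only; the paper's exact first-stage model and second-stage MILP filtering are not reproduced here.

```python
# Minimal sketch: input-oriented CCR DEA where DMU `o` (a time period of an
# evolving unit) may use only DMUs 0..o as peers.  Assumed formulation; the
# paper's own first-stage model may differ.
import numpy as np
from scipy.optimize import linprog

def temporal_dea(X, Y, o):
    """Return (theta, lambdas) for DMU `o` with peers restricted to 0..o.

    X: (m, n) input matrix, Y: (s, n) output matrix, columns ordered by time.
    """
    m, n = X.shape
    s, _ = Y.shape
    J = o + 1                               # admissible peers: periods 0..o
    c = np.zeros(1 + J)                     # variables: [theta, lambda_0..lambda_{J-1}]
    c[0] = 1.0                              # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):                      # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[i, o]], X[i, :J])))
        b_ub.append(0.0)
    for r in range(s):                      # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[r, :J])))
        b_ub.append(-Y[r, o])
    bounds = [(None, None)] + [(0.0, None)] * J
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=bounds, method="highs")
    return res.x[0], res.x[1:]

# Toy data: one input, one output, five periods of a single evolving unit.
X = np.array([[4.0, 3.5, 3.0, 3.2, 2.8]])
Y = np.array([[1.0, 1.2, 1.5, 1.4, 1.6]])
for t in range(X.shape[1]):
    theta, lam = temporal_dea(X, Y, t)
    print(f"period {t}: theta = {theta:.3f}, peers = {np.round(lam, 3)}")
```

In the full approach, the nonzero lambda values produced by such first-stage runs would then be passed to the second-stage multiobjective MILP, which trades off spatial closeness (to the frontier) against temporal closeness (between periods) when selecting the final peers.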

Keywords

Data Envelopment Analysis · Efficiency · OR in software · Multiobjective programming · Linear Programming

Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  1. Aston Business School, Aston University, Birmingham, UK
  2. Department of Applied Informatics, University of Macedonia, Thessaloniki, Greece