Journal of the Operational Research Society, Volume 63, Issue 8, pp 1098–1115

Performance assessment of secondary schools: the snapshot of a country taken by DEA

General Paper

Abstract

This paper describes a performance assessment of Portuguese secondary schools using data envelopment analysis (DEA). The assessment adopts a perspective where schools are viewed as promoting students' achievement, given the students' characteristics in terms of academic ability and socio-economic background. Our sample comprised all secondary schools in Portugal that offer both basic and secondary education levels. Two types of DEA analysis are performed: one using an output-oriented model that restricts the weights on outputs (exam scores) to be linked to the number of students who took each exam at the school, and the other using a model that restricts factor weights to be equal across all schools. In the latter model, the weight restrictions are linked to the total number of exams taken nationally. The first model is well suited for identifying the worst performing schools and for assessing schools that may specialize in certain subjects, whereas the latter is better suited for improving discrimination among the best performing schools when identifying benchmarks, as well as for constructing performance rankings.
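As an illustration of the first type of analysis, the sketch below implements an output-oriented CCR multiplier model in which each school's output (exam score) weights are constrained to be proportional to the number of students who sat each exam. This is a minimal sketch, not the paper's exact formulation: the arrays X, Y and N, the function output_oriented_efficiency, and all numbers are hypothetical, and the linear programme is solved with SciPy's linprog.

# A minimal sketch, on hypothetical data, of an output-oriented CCR multiplier
# model with weight restrictions tying each school's output (exam score)
# weights to the number of students sitting each exam. Illustration only,
# not the paper's exact model.
import numpy as np
from scipy.optimize import linprog

def output_oriented_efficiency(X, Y, N, school):
    """Output-oriented CCR multiplier model for one school.

    X: (schools, inputs)  e.g. prior attainment, socio-economic indicators
    Y: (schools, outputs) e.g. average exam scores per subject
    N: (schools, outputs) number of students sitting each exam
    Returns an efficiency score in (0, 1]; 1 means the school is on the frontier.
    """
    n, m = X.shape
    _, s = Y.shape
    # Variables z = [v_1..v_m, u_1..u_s]: input weights v, output weights u.
    # Minimize v'x_o subject to u'y_o = 1 and u'y_j <= v'x_j for every school j.
    c = np.concatenate([X[school], np.zeros(s)])
    A_ub = np.hstack([-X, Y])                          # u'y_j - v'x_j <= 0
    b_ub = np.zeros(n)
    A_eq = [np.concatenate([np.zeros(m), Y[school]])]  # normalization u'y_o = 1
    b_eq = [1.0]
    # Weight restrictions: u_r / u_1 = n_ro / n_1o, i.e. output weights are
    # proportional to the school's own exam counts.
    for r in range(1, s):
        row = np.zeros(m + s)
        row[m] = -N[school, r]       # coefficient on u_1
        row[m + r] = N[school, 0]    # coefficient on u_r
        A_eq.append(row)
        b_eq.append(0.0)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=np.array(A_eq), b_eq=b_eq,
                  bounds=[(0, None)] * (m + s), method="highs")
    return 1.0 / res.fun  # optimal value >= 1; its inverse is the efficiency

# Illustrative data: 4 schools, 1 input, 2 exam subjects (all numbers invented).
X = np.array([[10.0], [12.0], [9.0], [11.0]])
Y = np.array([[12.0, 14.0], [13.0, 11.0], [10.0, 15.0], [11.0, 12.0]])
N = np.array([[40, 60], [55, 45], [30, 70], [50, 50]])
for o in range(len(X)):
    print(f"school {o}: efficiency = {output_oriented_efficiency(X, Y, N, o):.3f}")

Because the proportionality constraints determine the output weights up to the scale fixed by the normalization, each school is effectively scored on a student-weighted average exam score, which makes low scores easy to interpret when flagging the worst performing schools.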

Keywords

data envelopment analysis; secondary schools; weight restrictions; single weights


Copyright information

© Operational Research Society 2011

Authors and Affiliations

  1. Universidade Católica Portuguesa, Portugal
  2. Faculdade de Engenharia da Universidade do Porto, Portugal
