
Government secondary school finances in New South Wales: accounting for students’ prior achievements in a two-stage DEA at the school level

Abstract

This study measures the efficiency of government secondary schools in New South Wales, Australia, using a two-stage semi-parametric production frontier approach to schooling. In contrast to previous research comparing school performance with two-stage data envelopment analysis (DEA), we control for the prior academic achievement of students by using a rich data set covering 2008 to 2010. We employ detailed financial data to derive the envelope for the efficient production frontier of the schools. Using Simar and Wilson’s (2007, 2011a) double-bootstrap procedure for two-stage DEA, the study finds that schools with lower total enrolments, a longer average length of teacher service, a higher proportion of special-education students (who attract extra government funding), and girls-only enrolment perform better than other schools. Location in provincial and outer-metropolitan areas, on the other hand, has a negative influence. An important result is that the socio-economic background of the students attending a school has no significant effect on their academic performance, whereas higher prior academic achievement has a positive and statistically significant impact on student achievement. These results are relevant to decision makers in the school sector, in particular for the funding criteria contained in the Gonski (2011) review report.
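As a rough illustration of the first stage described above, the following Python sketch computes Farrell input-oriented efficiency scores under constant returns to scale by solving one linear program per school. It uses made-up data and invented names; it is not the authors’ implementation, which embeds such DEA estimates in Simar and Wilson’s double-bootstrap procedure (the input orientation and returns-to-scale choices are discussed in the Notes below).

```python
# Minimal sketch of first-stage, input-oriented DEA under constant
# returns to scale (illustrative only; data and names are hypothetical).
import numpy as np
from scipy.optimize import linprog


def dea_input_crs(X, Y):
    """Farrell input-oriented efficiency scores under CRS.

    X : (n, m) array of inputs (e.g. per-school expenditures)
    Y : (n, s) array of outputs (e.g. average test scores)
    Returns an (n,) array of scores theta, with theta = 1 for schools
    on the estimated frontier and theta < 1 for inefficient schools.
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n].
        c = np.zeros(1 + n)
        c[0] = 1.0  # minimise theta
        # Inputs: sum_j lambda_j * x_ij <= theta * x_io for each input i.
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # Outputs: sum_j lambda_j * y_rj >= y_ro for each output r.
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([b_in, b_out]),
                      bounds=[(0, None)] * (1 + n),
                      method="highs")
        scores[o] = res.x[0]
    return scores


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(1.0, 5.0, size=(30, 2))  # two inputs, 30 "schools"
    Y = rng.uniform(1.0, 5.0, size=(30, 1))  # one output
    print(dea_input_crs(X, Y).round(3))
```

In the paper, the resulting scores are then related to environmental variables (enrolment, teacher experience, location, and so on) in the second stage of the double-bootstrap procedure.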


Notes

  1. For example, Perry and McConney (2010) did not include financial school data in their analysis of students’ academic achievements in Australian schools in relation to socio-economic status. Mante and O’Brien (2002) calculated measures of technical efficiency of Australian schools in Victoria, but again without including financial data on schools. On the other hand, Lamb et al. (2004) included funding from various sources for Victorian schools in regressions explaining student achievement scores.

  2. For an introduction to DEA, with numerous applications to Australian school data and accompanying computer code, see Blackburn et al. (2014a).

  3. For a longitudinal study on non-government (private) Australian schools in the state of Victoria, see Marks (2015a). Also, analysing school-level data instead of student-level data can lead to misleading results if relevant student factors are omitted, as discussed for example by Marks (2010, p. 269) for socio-economic status when prior student academic achievements are not controlled for.

  4. A related paper by one of the authors (Chakraborty and Blackburn, 2013) used only a subset of the data employed here and applied a DEA approach that is subject to the critique of Simar and Wilson (2007, 2011a), whereas ours is not.

  5. Worthington (2001) provided a general explanation of DEA and a review of empirical studies applying it to schools. Cook and Seiford (2009) surveyed DEA advancements over a thirty-year period.

  6. This ignores the possibility of different production functions across some types of schools. However, Australian government schools follow a similar “production process”, as they operate in similar regulatory environments. We did not include non-government schools in our analysis because they likely follow a different production process in this sense. Also, the numbers of observations for boys-only and girls-only schools (18 and 22, respectively) in our sample are too small for reliable inference with separate production functions.

  7. They (p. 19) pointed out that their value-added index avoids the endogeneity problems that arise when test scores reflect student and family characteristics beyond the control of schools and these characteristics are not controlled for.

  8. Grosskopf et al. (2014b) clarified that using expenditures as a proxy for input quantities is valid only when all observations face the same input prices.

  9. Changes were introduced in NSW in 2012.

  10. See, for example, Grosskopf et al. (2014a, b).

  11. Grosskopf et al. (2014a) instead employed cost functions and, for this purpose, estimated a hedonic wage function model, for which we have insufficient data.

  12. We use an input orientation because our outputs are not under the direct control of the decision makers, whereas the inputs are.

  13. See also Simar and Wilson (2011b).

  14. Using 200 observations instead did not affect the results in any meaningful way. This is consistent with Simar and Wilson (2007, p. 14), who found that 100 replications are “typically sufficient.” (A simplified sketch of the procedure’s second-stage truncated regression follows these notes.)

  15. At Year 10, the last year of compulsory schooling, the school’s median test result is reported for the examination average over five subjects.

  16. Table 1 provides detailed statistics for all test scores that we use in our analysis.

  17. We considered variable returns to scale in the DEA; however, the algorithm did not converge.

  18. We include a dummy variable for selective schools to control for schools that pick students based on academic quality.

  19. Lamb et al. (2004, p. 29) pointed out, in the context of schools in Victoria, that the cohort two years earlier contains many of the same students. Miller and Voon (2011) also discussed the importance of this issue but could not follow our approach due to data unavailability. They included instead in their study Year 3 achievements in 2009 as a proxy for 2009 Year 5 students’ prior academic achievement and stated (p. 377) that “… our measure should be viewed as only a crude proxy for prior academic achievement.”

  20. In principle, DEA could identify just one efficient school, which would suggest an extreme outlier.

  21. We do not report the results in order to conserve space.

  22. Recently, Daraio et al. (2016) developed a formal empirical test of the separability condition in Simar and Wilson (2007), based on new central limit theorem results that they derived. They also proposed conditional efficiency estimators for the case where separability is rejected. We leave application of this test to future research; our approach here is instead an exploratory, descriptive check of the separability condition.
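As flagged in note 14, the second stage of Simar and Wilson’s (2007) procedure regresses the first-stage efficiency estimates (scaled so that values of one lie on the frontier and larger values indicate inefficiency) on environmental variables using a regression truncated at one. The Python sketch below fits such a left-truncated normal regression by maximum likelihood on simulated data; all names and data are invented for illustration, and the bias correction and the two bootstrap loops of the full double-bootstrap algorithm are omitted.

```python
# Simplified sketch of a second-stage regression truncated at one
# (illustrative only; the double-bootstrap steps are omitted).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm


def neg_loglik(params, delta, Z):
    """Negative log-likelihood of a normal regression left-truncated at 1."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)  # keeps sigma positive
    mu = Z @ beta
    # log density of delta given delta >= 1
    ll = norm.logpdf(delta, loc=mu, scale=sigma) - norm.logsf((1.0 - mu) / sigma)
    return -np.sum(ll)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 200
    # One environmental variable plus an intercept (made-up data).
    Z = np.column_stack([np.ones(n), rng.normal(size=n)])
    beta_true, sigma_true = np.array([1.3, 0.2]), 0.15
    mu = Z @ beta_true
    # Draw scores delta >= 1 from a left-truncated normal by inverse CDF.
    lo = norm.cdf((1.0 - mu) / sigma_true)
    delta = mu + sigma_true * norm.ppf(lo + rng.uniform(size=n) * (1.0 - lo))

    # Start from OLS estimates, then maximise the truncated likelihood.
    beta_ols, *_ = np.linalg.lstsq(Z, delta, rcond=None)
    start = np.append(beta_ols, np.log((delta - Z @ beta_ols).std()))
    fit = minimize(neg_loglik, start, args=(delta, Z), method="BFGS")
    beta_hat, sigma_hat = fit.x[:-1], np.exp(fit.x[-1])
    print("beta:", beta_hat.round(3), "sigma:", round(sigma_hat, 3))
```

In the full procedure, repeated draws from this truncated model are used to bias-correct the DEA scores and to form bootstrap confidence intervals for the coefficients.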

References

  • ACARA (2010) NAPLAN Achievement in reading, writing, language conventions and numeracy: Report for 2010. Australian Curriculum Assessment and Reporting Authority, Sydney

  • Alexander RJ, Haug AA, Jaforullah M (2010) A two-stage double-bootstrap data envelopment analysis of efficiency differences of New Zealand secondary schools. J Prod Anal 34:99–110

  • Bădin L, Daraio C, Simar L (2012) How to measure the impact of environmental factors in a nonparametric production model. European J Operat Res 223:818–833

  • Blackburn V, Brennan S, Ruggiero J (2014a) Nonparametric estimation of educational production and costs using data envelopment analysis. International Series in Operations Research & Management Science, Springer, New York, Volume 214

  • Blackburn V, Brennan S, Ruggiero J (2014b) Measuring efficiency in Australian schools: a preliminary analysis. Socio-Econ Plan Sci 48:4–9

  • Bradley S, Draca M, Green C (2004) School performance in Australia: is there a role for quasi-markets? Australian Econ Rev 37:271–286

  • Chakraborty K, Blackburn V (2013) Efficiency and equity in funding for government schools in Australia. Australian Econ Papers 52:127–142

  • Charnes A, Cooper WW, Thrall RM (1986) Identifying and classifying scale and technical efficiencies in observed data via data envelopment analysis. Oper Res Lett 5:105–110

  • Cook WD, Seiford LM (2009) Data envelopment analysis (DEA) – Thirty years on. European J Operat Res 192:1–17

  • Daraio C, Simar L, Wilson PW (2016) Nonparametric estimation of efficiency in the presence of environmental variables. Technical Report N. 2, La Sapienza, University of Rome, Rome

  • Farrell MJ (1957) The measurement of productive efficiency. J R Stat Soc Series A 120:253–281

  • Gonski D (2011) Review of funding for schooling - Final report (December). Commonwealth Government of Australia, Canberra

  • Grosskopf S, Hayes KJ, Taylor LL (2014a) Efficiency in education: research and implications. Appl Econ Perspect and Policy 36:175–210

  • Grosskopf S, Hayes KJ, Taylor LL (2014b) Applied efficiency analysis in education. Econ and Bus Lett 3:19–26

  • Hanushek EA (2013) Economic growth in developing countries: the role of human capital. Econ Educ Rev 37:204–212

  • Hanushek EA, Woessmann L (2007) The role of education quality in economic growth. World Bank Policy Research Working Paper 4122

  • Hanushek EA (2005) The economics of school quality. German Econ Rev 6:269–286

  • Hanushek EA (2003) The failure of input-based schooling policies. Econ J 113:F64–F98

  • Hanushek EA (1986) The economics of schooling: production and efficiency in public schools. J Econ Lit 24:1141–1177

  • Kneip A, Park BU, Simar L (1998) A note on the convergence of nonparametric DEA estimates for production efficiency scores. Econ Theory 14:783–793

  • Lamb S, Rumberger R, Jesson D, Teese R (2004) School performance in Australia: Results from analyses of school effectiveness. Centre for Post-Compulsory Education and Lifelong Learning, University of Melbourne, Melbourne, Report for the Victorian Department of Premier and Cabinet

  • Mante B, O’Brien G (2002) Efficiency measurement of Australian public sector organizations: the case of state secondary schools in Victoria. J Edu Admin 40:274–296

  • Marks GN (2015a) Do catholic and independent schools “add-value” to students’ tertiary entrance performance? Evidence from longitudinal population data. Australian J Edu 59:133–157

  • Marks GN (2015b) Are school-SES effects statistical artefacts? Evidence from longitudinal population data. Oxford Rev of Edu 41:122–144

  • Marks GN (2010) What aspects of schooling are important? School effects on tertiary entrance performance. Sch Eff Sch Improv 21:267–287

  • Marks GN, McMillan J, Hillman K (2001) Tertiary entrance performance: the role of student background and school factors. LSAY Research Reports. Longitudinal Surveys of Australian Youth Research Report No. 22, http://research.acer.edu.au/lsay_research/24

  • Miller PW, Voon D (2014) School outcomes in New South Wales and Queensland: a regression discontinuity approach. Edu Econ 22:427–448

  • Miller PW, Voon D (2011) Lessons from my school. Australian Econ Rev 44:366–386

  • Mok M, Flynn M (1996) School size and academic achievement in the HSC examination: is there a relationship? Issues in Edu Leadership 6:57–78

  • Oaxaca R (1973) Male–female wage differentials in urban labor markets. Internat Econ Rev 14:693–709

  • Perry LB, McConney A (2010) Does the SES of the school matter? An examination of socioeconomic status and student achievement using PISA 2003. Teach Coll Rec 112:1137–1162

  • Productivity Commission - Report on Government Services (2014) School education. Volume B, Attachment Table 4A, Tables 4A.1-4A.135. Retrieved on 12 March 2016 at: http://www.pc.gov.au/research/ongoing/report-on-government-services/2014/child-care,-education-and-training/download-the-volume/rogs-2014-volumeb-child-care-education-and-training.pdf

  • Shephard RW (1953) Cost and production functions. Princeton University Press, Princeton

  • Simar L (1996) Aspects of statistical analysis in DEA-type frontier models. J Prod Anal 7:177–185

  • Simar L, Wilson PW (2011a) Two-stage DEA: Caveat emptor. J Prod Anal 36:205–218

  • Simar L, Wilson PW (2011b) Inference by the m out of n bootstrap in nonparametric frontier models. J Prod Anal 36:33–53

  • Simar L, Wilson PW (2007) Estimation and inference in two-stage semi-parametric models of production processes. J Econ 136:31–64

  • Simar L, Wilson PW (1998) Sensitivity analysis of efficiency scores: how to bootstrap in nonparametric frontier models. Manage Sci 44:49–61

  • Wilson PW (2008) FEAR 1.0: A software package for frontier efficiency analysis with R. Socio-Econ Plan Sci 42:247–254

  • Worthington AC (2001) An empirical survey of frontier efficiency measurement techniques in education. Edu Econ 9:245–268

Acknowledgements

This article replaces an earlier unpublished version titled “Efficiency Aspects of Government Secondary School Finances in New South Wales: Results from a Two-Stage Double-Bootstrap DEA at the School Level” that did not control for prior academic achievements. The authors thank, without implicating, colleagues and anonymous referees for very helpful comments that improved the paper considerably.

Author information

Corresponding author

Correspondence to Alfred A. Haug.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Disclaimer

The views expressed in this paper are those of the authors and not those of the Department of Education and Communities.

Cite this article

Haug, A.A., Blackburn, V.C. Government secondary school finances in New South Wales: accounting for students’ prior achievements in a two-stage DEA at the school level. J Prod Anal 48, 69–83 (2017). https://doi.org/10.1007/s11123-017-0502-x
