Annals of Operations Research, Volume 67, Issue 1, pp 141–161

Data envelopment analysis applied to quality in primary health care

  • Javier Salinas-Jiménez
  • Peter Smith

Abstract

The performance of primary care should ultimately be judged by its effect on the health outcomes of individual patients. For the foreseeable future, however, it is inconceivable that the outcome data necessary to reach such a judgement will be available, and in any case specification of the statistical model needed to analyse outcomes is fraught with difficulty. This paper therefore sets out a model of primary care performance based on the premise that certain measurable quality indicators can act as proxies for outcome. On that premise, a model of performance can be deduced which takes into account the effect of resources and patient characteristics on outcome. The most appropriate analytic technique for making this model operational is data envelopment analysis (DEA). It is argued that DEA handles multiple dimensions of performance more comfortably than statistically based models and is less vulnerable to the misspecification bias that afflicts them. The issues are illustrated with an example from English Family Health Service Authorities.
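
As a concrete illustration of the kind of linear programme that underlies such an analysis, the sketch below solves the standard input-oriented, constant-returns-to-scale DEA envelopment problem (the Charnes-Cooper-Rhodes formulation) for a handful of units. It is a minimal sketch under stated assumptions, not the paper's own specification: the data, the choice of inputs (resources), outputs (quality indicators standing in for outcome) and the returns-to-scale assumption are all hypothetical.

```python
# Minimal sketch of an input-oriented, constant-returns-to-scale DEA model
# (the Charnes-Cooper-Rhodes envelopment form). All data below are
# hypothetical: rows of X might be practice resources, rows of Y the
# quality indicators standing in for outcome.
import numpy as np
from scipy.optimize import linprog

# columns = decision-making units (e.g. authorities), rows = factors
X = np.array([[5.0, 8.0, 7.0, 4.0],      # input 1, e.g. GPs per 10,000 population
              [3.0, 2.0, 6.0, 5.0]])     # input 2, e.g. practice nurses
Y = np.array([[80.0, 90.0, 85.0, 70.0],  # output 1, e.g. immunisation uptake (%)
              [60.0, 75.0, 65.0, 55.0]]) # output 2, e.g. screening rate (%)

n_units = X.shape[1]

def dea_efficiency(unit):
    """Efficiency of one unit: minimise theta such that some non-negative
    combination of all units produces at least this unit's outputs while
    using no more than theta times its inputs."""
    # decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n_units + 1)
    c[0] = 1.0                                 # objective: minimise theta
    # inputs:  sum_j lambda_j x_ij - theta * x_i0 <= 0
    A_in = np.hstack([-X[:, [unit]], X])
    b_in = np.zeros(X.shape[0])
    # outputs: -sum_j lambda_j y_rj <= -y_r0  (i.e. produce at least y_r0)
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])
    b_out = -Y[:, unit]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n_units + 1))
    return res.x[0]

for u in range(n_units):
    print(f"unit {u}: efficiency score = {dea_efficiency(u):.3f}")
```

A unit receives a score of 1 only if no non-negative combination of the other units can match its outputs while using proportionally fewer inputs; a score below 1 indicates the proportional input contraction that would bring the unit onto the efficient frontier.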



Copyright information

© J.C. Baltzer AG, Science Publishers 1996

Authors and Affiliations

  • Javier Salinas-Jiménez, Departamento de Economía Aplicada, Universidad de Extremadura, Spain
  • Peter Smith, Department of Management, University of St. Andrews, St. Andrews, Scotland
