Systemic Practice and Action Research

Volume 22, Issue 1, pp 15–30

Implementing and Evaluating Performance Measurement Initiative in Public Leisure Facilities: An Action Research Project

Original Paper

Abstract

This paper addresses the issues that arise when implementing and evaluating a performance measurement initiative. Using action research, the study began by designing an innovative performance measurement system and then applied the system in three public leisure centres in England during 2005–2007. It was found that, for practitioners in the public leisure sector, inclusiveness and simplicity are the most important criteria for a good performance measurement system: not only does the performance data need to be inclusive, but the analytical process also needs to be simple and understandable. In addition, facility managers’ analytical skills and their motivations for benchmarking are two factors that determine the practicability of the system developed. Finally, when conducting a long-term action research project, there is a need to continually communicate the actual and potential benefits of the change to senior managers. If they are committed and enthusiastic, it is easier to gain the support of other levels of the organisation.

Keywords

Action research · Performance measurement system · Benchmarking · Leisure facility · Data envelopment analysis


Copyright information

© Springer Science+Business Media, LLC 2008

Authors and Affiliations

Department of Leisure and Recreation Management, Kainan University, Taoyuan, Taiwan
