MetricM: a modeling method in support of the reflective design and use of performance measurement systems

  • Stefan Strecker
  • Ulrich Frank
  • David Heise
  • Heiko Kattenstroth
Original Article

Abstract

Performance indicators play a key role in management practice. The existence of a coherent and consistent set of performance indicators is widely regarded as a prerequisite to making informed decisions in line with the firm's objectives. Designing such a system of performance indicators requires a profound understanding of the relations between financial and non-financial metrics, organizational goals, the decision scenarios to be supported, and the relevant organizational context—including subtleties resulting from implicit assumptions and hidden agendas that can lead to the dysfunctional consequences associated with the ill-informed use of performance indicators. In this paper, we investigate whether a domain-specific modeling method can address requirements essential to the reflective design of performance measurement systems, and which structural and procedural features such a method entails. The research follows a design research process in which we describe a research artifact and evaluate it to assess whether it meets the intended goals and domain requirements. We specify the design goals, requirements, and assumptions underlying the method construction, discuss the structural specification of the method and its design rationale, and provide an initial method evaluation. The results indicate that the modeling method satisfies the requirements of the performance measurement domain and that such a method contributes to the reflective definition and interpretation of performance measurement systems.

Keywords

Performance measurement · Enterprise modeling · Metamodeling · Domain-specific modeling language · Method engineering · Design research

Acknowledgments

The authors would like to thank the three anonymous referees for their constructive comments which greatly helped to improve the manuscript. We would also like to thank S. Zelewski for his invaluable input on inter-goal relations, and we would like to acknowledge the contribution of H. Schauer to earlier work on a predecessor to MetricML.


Copyright information

© Springer-Verlag 2011

Authors and Affiliations

  • Stefan Strecker¹
  • Ulrich Frank¹
  • David Heise¹
  • Heiko Kattenstroth¹

  1. Information Systems and Enterprise Modelling Research Group, Institute for Computer Science and Business Information Systems, University of Duisburg-Essen, Essen, Germany
